
Restrict external access to PeopleSoft with Squid June 8, 2011

Posted by Duncan in Infrastructure, PeopleSoft.

I recently had to expose a client’s PeopleSoft installation to the outside world, which I did in the usual manner (additional PIA in the DMZ etc).

We wanted to use the “closed by default, open by exception” approach, so we would start by blocking access to everything and then open the areas we needed access to URL by URL.  I suspected that the final ‘URL Whitelist’ might take many iterations to get right and as the Reverse Proxy in the DMZ was outside of my control I needed to trial it somewhere else first.

I commandeered one of our less frequently used environments and went about searching for a quick/free method of blocking access.  After trying a few different approaches I settled on Squid, the open-source forward-proxy / web-caching server.  Although it’s better known for running on Unix systems, there is a Windows implementation and it can operate perfectly well as a reverse-proxy.

Setting up Squid

Once I’d downloaded and unzipped the binaries and installed Squid as a service (using this helpful write-up as a guide), it was just a case of setting up the rules.

In the ACLs section I added my bad and good URLs:

acl bad_url urlpath_regex DEV
acl good_url urlpath_regex "c:\squid\etc\good-urls.squid"

This would block any URL containing DEV (my chosen environment was DEV), but then allow any URLs listed in the ‘good-urls.squid’ file.  I then had to specify in the http_access section what to do with these ACL groups.

http_access allow good_url
http_access deny bad_url
http_access allow all

It took me a few goes to get this right as the last line confused me for a while, but luckily there are copious notes in the provided .conf file:

If none of the “access” lines cause a match, the default is the opposite of the last line in the list.  If the last line was deny, the default is allow. Conversely, if the last line is allow, the default will be deny.

I was happy leaving my PeopleSoft environment on port 80 and Squid on 3128 as this is just a temporary setup for my testing.  Obviously Squid would be on port 80 if this was a production setup.

I amended the default port line thus:

http_port 3128 defaultsite=xxx.yyy.com

(where xxx is the hostname and yyy is the domain name)

And finally I added this line:

cache_peer localhost parent 80 0 originserver default

I used localhost as the peer hostname, since Squid is on the same host as the PIA; the ‘originserver’ and ‘default’ options tell Squid to treat the peer as the origin web server and forward requests to it.
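For reference, the full set of additions to squid.conf from the steps above boils down to something like this (a sketch only — the hostname is the placeholder from earlier, and localhost assumes Squid and the PIA share a host):

# Listen on 3128 and present the PIA's hostname to clients
http_port 3128 defaultsite=xxx.yyy.com

# ACLs: anything mentioning DEV is bad, unless listed in the whitelist file
acl bad_url urlpath_regex DEV
acl good_url urlpath_regex "c:\squid\etc\good-urls.squid"

# Whitelist wins, then the DEV block, then everything else through
http_access allow good_url
http_access deny bad_url
http_access allow all

# Forward whatever gets through to the PIA on the same host
cache_peer localhost parent 80 0 originserver default

Note that the order of the http_access lines matters: a whitelisted URL is allowed before the DEV rule ever gets a chance to deny it.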

Setting up PeopleSoft

In the Web Profile ‘Virtual Addressing’ tab, add the reverse proxy details.  This will ensure that PeopleSoft uses the reverse-proxy host and port when generating URLs.  Bounce the PIA.

Custom Error Page

If you want a nice custom ‘Access Denied’ page instead of the default Squid one, the error page templates can be found in ‘C:\squid\share\errors\English’.  They have no file extension, but they’re plain HTML, so they’re a cinch to amend.

Building up the good-urls.squid file

This is largely going to vary depending upon what you want to expose to the external users.  A lot of what we opened up were custom pages, so there isn’t much value in sharing the full file here.  Having said that, here is a snippet of our file:
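The original snippet was posted as an image and hasn’t survived here.  A hypothetical reconstruction that fits the line-by-line notes below might look like this — every path and component name is illustrative, not taken from the original file, and assumes a PIA site named ‘ps’:

# 1-2: signon page
^/ps/signon\.html
^/psp/ps/\?cmd=login
# 3: Employee Portal homepage
^/psp/ps/EMPLOYEE/EMPL/h/\?tab=DEFAULT
# 4-5: images and cached resources
^/ps/images/
^/cs/ps/cache/
# 6: viewing attachments (hypothetical weblib name)
^/psc/ps/EMPLOYEE/EMPL/s/WEBLIB_FILE
# 7: a sample page/component (hypothetical)
^/psc/ps/EMPLOYEE/EMPL/c/SAMPLE_MENU\.SAMPLE_CMPNT_1\.GBL
# 8: Rich Text editor resources
^/cs/ps/cache/ckeditor/
# 9-10: more sample pages/components (hypothetical)
^/psc/ps/EMPLOYEE/EMPL/c/SAMPLE_MENU\.SAMPLE_CMPNT_2\.GBL
^/psc/ps/EMPLOYEE/EMPL/c/SAMPLE_MENU\.SAMPLE_CMPNT_3\.GBL
# Remainder: timeout and signout links
^/psp/ps/\?cmd=expire
^/psp/ps/\?cmd=logout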


Lines 1 and 2 sort out the signon page.

Line 3 is the Employee Portal homepage.

Lines 4 and 5 are for images.  Lines 6 and 8 are for viewing attachments and the Rich Text editor.

Lines 7, 9 and 10 are sample PeopleSoft pages/components.

The remainder deal with the timeout and signout links.

(Assuming that your PIA site is ‘ps’)


And you’re done.  There are a few little quirks to note.

Firstly, every time you change your URLs file you’ll need to restart the Squid service, but it’s a quick process so doesn’t hold you up too much.
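For reference, from an Administrator command prompt the restart is just the following (assuming the service was registered under the name ‘squid’; yours may differ depending on how it was installed):

net stop squid
net start squid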

Secondly, PeopleSoft frequently uses the ‘?’ special character as a URL delimiter, and Squid only matches against the characters before this point by default.  There are several occasions when you need to match against the full URL, which is why I’ve used urlpath_regex in the ACL section above.  This allowed me to escape the special characters so that the log-out, time-out, and view-attachment links work correctly.
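To see why the escaping matters, here is a quick illustration (in Python rather than Squid, and the ‘ps’ site name and logout URL are assumptions): with the ‘?’ escaped, the pattern matches the literal character in the sign-out link; left unescaped, the ‘?’ becomes a regex quantifier that makes the preceding ‘/’ optional and the pattern no longer matches the link at all.

```python
import re

# Escaped '?' matches the literal question mark in the logout link
escaped = re.compile(r"^/psp/ps/\?cmd=logout")

# Unescaped '?' makes the preceding '/' optional -- not what we want
unescaped = re.compile(r"^/psp/ps/?cmd=logout")

url = "/psp/ps/?cmd=logout"
print(bool(escaped.search(url)))    # the escaped pattern matches
print(bool(unescaped.search(url)))  # the unescaped pattern does not
```

The same reasoning applies to the time-out (`cmd=expire`) and attachment-viewing links.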



1. Richard - June 8, 2011

Have you deliberately excluded your good-urls.squid file from this post? I can imagine with the special web profile pages, servlet directives and caching, the list would end up quite elaborate, even just to expose 2-3 components.

If it’s not posted here, could you consider posting a subset of the file that exposes a few components, and all the required weblibs and other ‘specials’?

2. Tipster - June 8, 2011

Good point Richard, and it’s probably not as bad as you fear. I’ve updated the post with a good chunk of our file. Our complete file only contains ~60 URLs, and those could be reduced further with some skillful regex’ing.
