Best way of restricting web access?

Hi there,

Have an "interesting" problem.

Scope

* Clients are not to be allowed access to the internet. Restrict and control with the firewall.

Scope creep

* Clients need access to Google to do a search and click on any links in those results. They will search for people/locations.

How can I best isolate and protect these clients' web access? I've been trying to figure out the best possible way to allow them Google search access and 'click' access to the results that come up.

Any input would be greatly appreciated, and anything goes when it comes to mitigating the scope creep and staying somewhat 'in line' with the original scope!


Cheers

A.

4 REPLIES

L6 Presenter

Filtering on destination IP just won't work because Google uses a huge number of IP addresses.

Set up a custom URL category that contains (for example) www.google.com (you might need to add a few other domains/subdomains/URLs as well) and enable SSL decryption so you can inspect encrypted HTTPS flows.

1) As the first rule, deny the Google services you don't wish to allow, such as Google Translate, which can be used as a proxy.

2) As the second rule, allow traffic using the custom URL category above along with the App-ID for Google search (if one exists).

You could also create a custom App-ID for this. Don't forget to set the service to application-default (or manually to TCP/80 and TCP/443).
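Roughly, in set-command form, the whole thing could look like the sketch below. Zone names (trust/untrust), rule and object names, and the App-IDs (google-translate, google-base) are illustrative, and exact set-command syntax varies across PAN-OS versions, so verify everything before committing:

# Custom URL category holding the Google entry points you want reachable
set profiles custom-url-category Google-Search-Only list [ www.google.com google.com ]

# Decrypt HTTPS toward that category so the rules below can inspect the flows
set rulebase decryption rules Decrypt-Google from trust to untrust source any destination any category Google-Search-Only type ssl-forward-proxy action decrypt

# Rule 1: deny the Google services you don't want (Translate can act as a proxy)
set rulebase security rules Deny-Google-Apps from trust to untrust source any destination any application google-translate service application-default action deny

# Rule 2: allow the search itself, pinned to the custom URL category
set rulebase security rules Allow-Google-Search from trust to untrust source any destination any application [ google-base web-browsing ] service application-default category Google-Search-Only action allow

# Everything else to the internet stays denied
set rulebase security rules Deny-All-Web from trust to untrust source any destination any application any service any action deny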


Mmm, how do I deal with the fact that the user has to click the actual search links that come up? That means internet access. I'm wondering if I could allow the search itself and then allow the 'click' through to any of the links; they'll have some 'google' redirect link in them when first clicked (creating a new App-ID for that), and then use an ultra-restrictive web-browsing rule that doesn't allow any file uploads or downloads. Has anyone tried to design something like this in the past?

Could using Google's web cache be sufficient for the users?

Because if you want them to be able to browse through to the search results, then you must enable surfing.
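If the cache route looks workable, note that cached copies are served from a separate Google host, so that host would have to go into the allowed category as well. Something like the line below (host name as commonly used for Google's cache; verify it in your own traffic):

# Cached pages load from a distinct Google host; append it to the category
set profiles custom-url-category Google-Search-Only list webcache.googleusercontent.com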

You could perhaps set up a custom App-ID that triggers on http-method GET, HEAD, or POST along with an http-referer matching ^http://www.google.com/ (or whatever domain Google uses in your case). This App-ID could of course be easily bypassed by a client sending a forged Referer header to fool the filter, but still...
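For illustration, such a custom App-ID might look roughly like this in set-command form. The object name is made up, and the signature nesting and the http-req-referer-header context are from memory, so treat every field here as an assumption to check against your PAN-OS version (or just build it in the GUI):

# Hypothetical custom App-ID matching browsing that carries a Google referer
set application Google-Referred-Browsing category general-internet subcategory internet-utility technology browser-based risk 2
set application Google-Referred-Browsing default port [ tcp/80 tcp/443 ]
set application Google-Referred-Browsing signature ref-sig and-condition 1 or-condition 1 operator pattern-match context http-req-referer-header pattern "^http://www\.google\.com/"

# The weakness: any client can forge the header and walk straight through, e.g.
#   curl --referer "http://www.google.com/" http://any-other-site.example/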

I'll look into how best to use the App-ID. They'll click predefined links, so I could use some filtering on that. They're also happy to use the cached web after their Google search, which means they can still 'surf' the internet, but it'll be pretty cumbersome for them and will more than likely result in them not doing it on these machines. I'll update this thread with the final result.

Cheers and thanks

A
