Find Blocked URLs when Geo Blocking


L0 Member

Hi Everyone,

 

We've implemented geo blocking as a way to help secure our environment in PAN-OS 10.1, with an exception/allow rule for a specific group of URLs. We sometimes run into a case where portions of a web site display fine, but other portions don't display at all. Currently, we sift through our browser developer tools (by pressing F12) to manually track down the URLs we think are being dropped. This gets pretty tedious and time-consuming, as we're not experts in the various browser dev tools panels. What's a better method of finding those 'hidden' URLs to add to our exception rule? I'm open to suggestions within the Palo, or a 3rd-party tool to assist with this.
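
One way to make that sifting less manual is to export the Network tab as a HAR file (right-click the request list > "Save all as HAR with content" in Chrome) and filter it with a short script instead of reading each request by eye. A rough sketch, assuming Python 3 and an export saved as blocked.har (the file name is just an example):

    # Rough sketch: list failed/blocked requests from a Chrome DevTools HAR export.
    # Assumes the HAR was saved via Network tab > "Save all as HAR with content";
    # the file name "blocked.har" is only an example.
    import json

    with open("blocked.har", "r", encoding="utf-8") as f:
        har = json.load(f)

    for entry in har["log"]["entries"]:
        status = entry["response"]["status"]
        # Chrome records status 0 for requests that never completed (e.g. reset
        # at the IP level); widen the test if 4xx/5xx responses also matter.
        if status == 0:
            url = entry["request"]["url"]
            server_ip = entry.get("serverIPAddress", "unknown")
            print(f"{status}\t{server_ip}\t{url}")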

4 REPLIES

Cyber Elite

@roryschmitz,

The problem that you'll run into is that the firewall isn't going to see the URL, since you're consistently resetting this at the IP level. What I've done in the past is build out a machine (or machines) that people can access that isn't subject to the geo restriction. Then just ensure that you have that traffic logging everything so you can build out the exceptions that you need to have in place.

You could obviously also choose to use a combination of machine/user, or solely have a test user account that bypasses these restrictions, if you don't want to limit it to a few machines across the environment. It's just a design decision that needs to be made for that aspect.

L6 Presenter

Currently going through this as Akamai had some event yesterday that caused them to shift CDN content to alternate datacenters all over the world. The easiest way I have found to identify the potentially offending IPs is to open Developer Tools in Chrome. Go to the non-functioning webpage, then go to the "Network" tab under Developer Tools. You should see multiple page assets in red, indicating they were blocked/failed to download. Clicking on an asset will show the full URL and the IP address the browser attempted to connect to. Once you have the failed URLs/IP addresses, you can check the PA logs to confirm they were blocked due to the geo restrictions, etc.
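
If you'd rather confirm the block from the firewall side without clicking through the GUI, the PAN-OS XML API can pull the matching traffic log entries. A minimal sketch, assuming you already have an API key and reachable management access; the hostname, key, and query values are placeholders to adjust for your environment:

    # Minimal sketch: query PAN-OS traffic logs over the XML API for sessions to a
    # destination IP that were not allowed. Hostname, API key, and the query string
    # are placeholders; tighten certificate verification for real use.
    import time
    import requests
    import xml.etree.ElementTree as ET

    FW = "https://firewall.example.com"   # placeholder management address
    KEY = "YOUR-API-KEY"                  # generated via type=keygen
    QUERY = "(addr.dst in 23.56.12.34) and (action neq allow)"

    # Step 1: submit the log query; the API returns a job ID.
    r = requests.get(f"{FW}/api/", params={
        "type": "log", "log-type": "traffic", "query": QUERY,
        "nlogs": "20", "key": KEY}, verify=False)
    job = ET.fromstring(r.text).findtext(".//job")

    # Step 2: poll until the job finishes, then print the fields that matter here.
    while True:
        r = requests.get(f"{FW}/api/", params={
            "type": "log", "action": "get", "job-id": job, "key": KEY},
            verify=False)
        root = ET.fromstring(r.text)
        if root.findtext(".//status") == "FIN":
            for e in root.iter("entry"):
                print(e.findtext("receive_time"), e.findtext("dst"),
                      e.findtext("dstloc"), e.findtext("action"),
                      e.findtext("rule"))
            break
        time.sleep(1)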

 

For Akamai and other CDNs that have a world-wide presence, I have created a custom "CDN" geocode to always allow these through the destination IP/region restrictions. So, for instance, with Akamai failing yesterday (because they moved normally US-hosted content to Sweden/India/Hong Kong):

Objects -> Regions
name = CDN
address = 23.32.0.0/11  23.64.0.0/14  23.72.0.0/13

 

Then add the "CDN" region to your Security Policy allowing internet access to certain destination geocodes.
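
If you manage this from scripts rather than the GUI, the same object can be pushed with the XML API's config actions. A minimal sketch, assuming vsys1 and a valid API key; double-check the xpath against your PAN-OS version before relying on it:

    # Minimal sketch: create the custom "CDN" region via the PAN-OS XML API and
    # commit. Hostname, API key, and vsys name are placeholders.
    import requests

    FW = "https://firewall.example.com"   # placeholder management address
    KEY = "YOUR-API-KEY"

    XPATH = ("/config/devices/entry[@name='localhost.localdomain']"
             "/vsys/entry[@name='vsys1']/region/entry[@name='CDN']")
    ELEMENT = ("<address>"
               "<member>23.32.0.0/11</member>"
               "<member>23.64.0.0/14</member>"
               "<member>23.72.0.0/13</member>"
               "</address>")

    # Set the region object in the candidate config, then commit it.
    requests.get(f"{FW}/api/", params={"type": "config", "action": "set",
                                       "xpath": XPATH, "element": ELEMENT,
                                       "key": KEY}, verify=False)
    requests.get(f"{FW}/api/", params={"type": "commit", "cmd": "<commit></commit>",
                                       "key": KEY}, verify=False)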

L0 Member

@BPry That's a good suggestion and I'll test a few methods. My initial thinking is to bypass based on identity/user account. We use Windows machines, so running the browser under a different user ID (Shift+right-click > Run as different user) may be a nice way to accomplish that. We'd then have the flexibility to bypass on any machine at any time.

 

@Adrian_Jensen, Thanks for those instructions. I was able to find the problem URL more easily this way and have one site working fine now. We, too, find that various CDNs will eventually re-route traffic to other countries. I'm clearly no expert, but for your Akamai example, could you instead use the pre-defined application 'akamai-client' (port 80/tcp) and allow it that way vs. IP-based? I'm wondering if there's an EDL for Akamai as well...

 

Side note: I did see another thread suggesting allowing the traffic at layer 2-3, then blocking it later in the policy at layer 7, just to gather the URLs. For security, I'd rather not even allow sessions from potential bad actors to get to that point, and I can only assume this consumes more resources on the firewall for the extra processing.

L6 Presenter

@roryschmitz I don't believe filtering on the application "akamai-client" would work, because this traffic is just "web-browsing" (akamai-client appears to be an additional client library for special "enhanced" downloading). In this case I think it is simply that the content happens to be hosted on an Akamai CDN server, not that there is anything unique about the content or the client downloading it.

 

It also depends on how the web page is coded. If the embedded assets have a URL of xxx.yyy.akamai.net/ then you can apply URL filtering based on Akamai. But if the assets have a URL of xxx.yyy.example.com/ that eventually resolves to an Akamai CDN IP through a series of DNS CNAME redirects, then the PA is never going to see the akamai.net URL... it sees the original URL request to example.com/ and a destination IP that it geocodes to wherever.
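
A quick way to tell which case you're in is to follow the CNAME chain for the asset's hostname and see whether it ends on a CDN domain. A rough sketch using dnspython (pip install dnspython); the hostname is just an example:

    # Rough sketch: follow a hostname's CNAME chain to see whether it eventually
    # lands on a CDN name such as *.akamai.net or *.akamaiedge.net.
    import dns.resolver

    name = "assets.example.com"   # example hostname taken from the failing page
    chain = [name]
    for _ in range(10):           # bound the chain length as a safety net
        try:
            answer = dns.resolver.resolve(name, "CNAME")
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            break
        name = str(answer[0].target).rstrip(".")
        chain.append(name)

    print(" -> ".join(chain))
    # If the chain ends on something like e1234.a.akamaiedge.net, the firewall only
    # ever sees the original example.com URL plus the resolved Akamai IP.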
