URL Allow List filters are not functioning as documented.

Not applicable

URL filtering works great with categories, but when an exception is published in the allow list there are some catches.

We have the Streaming Media category set to block-continue; it includes YouTube, which we want users to be able to access without a block.

I create an allow exception *.youtube.com/* and commit, open a new browser, clear the cache, and go to www.youtube.com, only to see that block-and-continue is still in effect.

So I added www.youtube.com and youtube.com as well, and then it worked.

I did the same thing for Dropbox, but it uses HTTPS, and the session was dropped every time I went to www.dropbox.com, which in turn redirected me to https://dropbox.com.

To get that working, I had to go against the documentation in the dialog box and add https:// in front of the dropbox.com entry.
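Not authoritative, but one way to see every scheme and hostname an allow list entry has to cover is to trace the redirect chain yourself. A quick sketch with the third-party Python requests library (the URL is just the example from this post):

# Trace the redirect chain so you can see every scheme and hostname
# (http vs. https, www vs. bare domain) the allow list entry must cover.
import requests

resp = requests.get("http://www.dropbox.com", allow_redirects=True, timeout=10)
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print("final:", resp.status_code, resp.url)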

I still haven't got the Dropbox desktop application to work, even with *.dropbox.com in the allow list.

This is an exercise in frustration!

L4 Transporter

You might want to read the following document on URL categorization, which describes how the PAN firewall parses URL filter entries.

https://live.paloaltonetworks.com/docs/DOC-3264

L3 Networker

If you need to allow a domain along with its subdomains, you must allow all of the following patterns (see the sketch below):

domain.*

*.domain.*

*.*.domain.*

*.*.*.domain.*

etc

This works for web-browsing. If the traffic is SSL, you need to terminate it (decryption) in order to gain full visibility; if not, only the domain.* part is visible.
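A minimal sketch of that matching model in Python, assuming (as described above) that the URL is split into tokens on "." and "/" and that each * stands in for exactly one token; this is only an illustration, not PAN code:

# Simplified model of per-token wildcard matching: split on "." and "/",
# and treat each "*" in a filter entry as exactly one token.
import re

def tokens(s):
    return [t for t in re.split(r"[./]", s) if t]

def matches(pattern, url):
    p, u = tokens(pattern), tokens(url)
    return len(p) == len(u) and all(pt in ("*", ut) for pt, ut in zip(p, u))

entries = ["domain.*", "*.domain.*", "*.*.domain.*"]
for url in ["domain.com", "www.domain.com", "sub1.sub2.domain.com"]:
    print(url, "->", [e for e in entries if matches(e, url)])
# domain.com           -> ['domain.*']
# www.domain.com       -> ['*.domain.*']
# sub1.sub2.domain.com -> ['*.*.domain.*']

Under this model each extra subdomain level needs its own pattern, which is why the list above keeps growing.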

In the case of a known application (such as Dropbox), I suggest you not rely on URLs; use application identification and always go beyond the URL profile.

L6 Presenter

Reading that technote, it really sounds odd that PA has chosen this behaviour.

The common approach in the market is to separate "domain.com" from "*.domain.com", but this also means that "*.domain.com" will match "sub1.sub2.domain.com".

That is, to fully block, for example, YouTube by URL filter you need two URL rules (sketched below):

1) youtube.com (this covers http(s)://youtube.com/*)

2) *.youtube.com (this covers http(s)://sub1.youtube.com/* but also http(s)://sub1.sub2.youtube.com/* and so on)
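A comparable sketch of that more common convention, assuming "*.youtube.com" simply means "any host ending in .youtube.com" at any depth (again just an illustration, not any vendor's code):

# "youtube.com" matches only the bare domain; "*.youtube.com" matches any
# host ending in ".youtube.com", no matter how many subdomain levels.
def matches(pattern, host):
    if pattern.startswith("*."):
        return host.endswith(pattern[1:])  # suffix match on ".youtube.com"
    return host == pattern

rules = ["youtube.com", "*.youtube.com"]
for host in ["youtube.com", "www.youtube.com", "sub1.sub2.youtube.com"]:
    print(host, "->", [r for r in rules if matches(r, host)])
# youtube.com           -> ['youtube.com']
# www.youtube.com       -> ['*.youtube.com']
# sub1.sub2.youtube.com -> ['*.youtube.com']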

Could perhaps someone from PaloAlto enlighten us on this subject?

The point of using URL filtering and not relying entirely on App-ID is what happened last Christmas:

http://researchcenter.paloaltonetworks.com/2012/12/app-id-cache-pollution-response/

http://researchcenter.paloaltonetworks.com/2013/01/app-id-cache-pollution-update/

The point here is that the next time App-ID fails (and it most likely will; otherwise there wouldn't have to be any updates for the app database =) the URL filtering in the same security policy (especially if it's a whitelisting rule) might save you...

That is, compare a security rule which only has "appid:facebook" with one that has both "appid:facebook" AND "url:facebook.com||*.facebook.com" (or whatever domains they use nowadays).
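A rough sketch of that belt-and-braces idea, using hypothetical rule structures (not PAN-OS policy syntax): the allow rule only permits traffic when both the identified application and the destination host agree, so a misidentified flow still has to pass the URL check.

# Hypothetical illustration: an allow rule that requires BOTH the App-ID
# and the URL/host to match, so an App-ID misclassification alone is not
# enough to slip traffic through.
def url_allowed(host, url_list):
    return any(host == u or (u.startswith("*.") and host.endswith(u[1:]))
               for u in url_list)

def rule_permits(app, host, rule):
    app_ok = app == rule["appid"]
    url_ok = url_allowed(host, rule["urls"]) if rule.get("urls") else True
    return app_ok and url_ok

appid_only = {"appid": "facebook"}
appid_plus_url = {"appid": "facebook", "urls": ["facebook.com", "*.facebook.com"]}

# Traffic wrongly identified as "facebook" but really going to evil.example:
print(rule_permits("facebook", "evil.example", appid_only))      # True  - slips through
print(rule_permits("facebook", "evil.example", appid_plus_url))  # False - the URL check catches it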
