I played around on our lab FW a bit but couldn't get this working. Here are my objectives:
- Create a "White List" custom URL category that allows only a handful of web sites. (Working, with a URL Filtering profile.)
- Log all permits. (Working; I got this by setting the action to alert.)
- Create a "Black List" custom URL category that denies a bunch of "noisy" URLs without logging.
- Log all other denies. (Working; I got this with a rule at the very end of my rulebase.)
The goal is to create a log-noise-reducing rule so I can see the denies that matter to me. I have tried several variations of rules and defining a separate URL policy, but so far without much luck. Any tips for accomplishing this?
When you look at your traffic log, there is a column that shows the rule being hit. Do you see your Black List rule listed, or is that rule being missed so traffic instead hits the bottom rule?
Generally, this should work the way you mention. The one caveat I can think of is the URL Filtering logs: those entries will show up no matter what, since a deny is always a logged action there. There is no way to deny a URL category in a URL Filtering profile without it being logged. You may be able to get this to work by skipping the URL Filtering profile and instead selecting the categories in the Security rule itself with a deny action, then unchecking the logging options on that rule, but I have not tried doing so yet.
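A rough, untested sketch of that approach in the PAN-OS CLI. The rule name "Block-Noisy-URLs" and category name "custom-deny-nolog" are placeholders, and exact syntax varies between PAN-OS versions:

```
# Hypothetical rule: deny traffic matching the noisy URL category without logging.
# Names here are placeholders -- adjust to your own rulebase.
set rulebase security rules Block-Noisy-URLs from any to any source any destination any
set rulebase security rules Block-Noisy-URLs application any service application-default
set rulebase security rules Block-Noisy-URLs category custom-deny-nolog action deny
set rulebase security rules Block-Noisy-URLs log-start no log-end no
```

Because the category match happens on the Security rule itself rather than in a URL Filtering profile, no URL Filtering log entry should be generated; only the (disabled) traffic log settings apply.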
Hope this helps,
Hello, if you put a security rule with the black list on top, without the log option enabled, it should avoid the excess logs.
E.g., create a custom URL category containing the black list, named something like custom-deny-nolog,
then add a security rule that blocks traffic matching that custom category, and put it at the top of the rulebase.
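In CLI terms, the category creation and rule placement might look like this (a sketch only; the category name, rule name, and URLs are placeholders, and newer PAN-OS versions also require setting a category type of "URL List"):

```
# Hypothetical custom URL category holding the noisy domains.
set profiles custom-url-category custom-deny-nolog list [ noisy1.example.com noisy2.example.com ]

# Move the (already created) blocking rule to the top of the rulebase
# so it matches before the whitelist and catch-all deny rules.
move rulebase security rules Block-Noisy-URLs top
```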
Thank you both for your responses.
I added a rule that precedes my URL Whitelist policy rule, configured as you indicated: it has a custom URL category defined, denies traffic, and is set not to log.
I am passing traffic through and it is getting denied; however, it still gets logged and never matches the rule I created. I'm still troubleshooting, but the source/destination zones and IPs (any), category, application/service, and everything else _should_ match it.
Probably the quickest workaround for my situation is to add the following expression to my log view filter:
( category neq 'URL Blacklist' )
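That filter can also be combined with other conditions in the traffic log search bar; for example, to show only denies that are outside the blacklist category (assuming 'URL Blacklist' is the custom category name):

```
( category neq 'URL Blacklist' ) and ( action eq deny )
```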
I think I can deal with that for now. Deny with no log would be icing on the cake, but unless someone has a quick fix I'll leave it at that.
You could just create a DNS black list on your DNS server that points those sites somewhere else. Since they would resolve to an internal address, the traffic would never hit your firewall.
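As one way to do that, if the internal resolver happens to be dnsmasq, a single line per domain redirects lookups to a sinkhole address (the domain and IP below are placeholders):

```
# /etc/dnsmasq.conf -- answer queries for a noisy domain (and its subdomains)
# with an internal sinkhole address instead of the real one.
address=/noisy.example.com/192.0.2.1
```

Other DNS servers offer equivalents (e.g., response policy zones in BIND); the effect is the same: the client never opens a connection through the firewall, so nothing is logged there.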