11-30-2021 05:41 AM
VM-300 running PAN-OS 10.0.8-h4 on KVM.
At one point an issue with commit showed up:
Error: Error reading signature DFA data
failed to handle CONFIG_UPDATE_START
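If anyone wants to see more than the short commit status message, the management-server log on the box should carry the detail (this is just the standard PAN-OS log location, nothing specific to this case):

> less mp-log ms.log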
Also, updates for WildFire and Apps/Threats were not being installed, and HA sync started to fail.
It was concluded that CTD resource usage was high: show system setting ctd state showed Content Allocator Usage at 100%.
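For anyone checking the same thing, this is the command in question; the exact output layout may differ a bit between releases, but the line to watch is Content Allocator Usage:

> show system setting ctd state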
Restarting the VM helped, but only for a while; usage went right back up as soon as I tried to install dynamic updates, with the same result - the commit failed.
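For reference, the installs can be triggered with the usual dynamic update CLI commands (nothing specific to this problem), roughly:

> request content upgrade download latest
> request content upgrade install version latest
> request wildfire upgrade download latest
> request wildfire upgrade install version latest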
Next step, as per a discussion found in the Live Community: a Data Pattern object was present on the firewall, and it was deleted. Memory usage instantly dropped to 95% and after a while to 88%, and commits, HA sync and updates are working as expected again.
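To confirm the recovery I just re-ran the same checks with standard operational commands:

> show system setting ctd state
> show high-availability state
> show jobs all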
I tried the same on a lab VM (no configuration, pretty much an empty VM) by creating custom Data Pattern and Spyware objects, installing all the updates, etc. I only got the usage to 91%, so I couldn't really reproduce the 100% usage and didn't get a commit to fail.
Anyway, using a Data Pattern object is not currently mandatory for us, but overall: has anyone seen something similar? Is there a way to use the custom objects while still keeping the memory usage in order?
12-13-2021 05:21 AM
Good Day.
Without knowing what the customer's Data Filtering pattern looks like, or what the alert settings were, it does appear as if the Data Pattern was too general, hence the resource spike. I am not aware of any workaround, but more carefully crafting the Data Filtering profile - the pattern itself and the alert thresholds - would be a start.
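Purely as an illustration, and not verified against the custom data pattern engine on any particular PAN-OS release (so treat the exact regex syntax as an assumption): a broad pattern like

[0-9]{9,}

fires on almost any long run of digits, while something delimited, e.g.

4[0-9]{3}([ -][0-9]{4}){3}

(a card-number-style pattern that requires separators between groups) gives the engine far less to track. Combined with a higher alert threshold in the Data Filtering profile, that should cut down both the noise and the resource usage.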