celery-worker 100% cpu usage



Hi,

I've been seeing 100% CPU usage from the celery-worker processes for a few weeks now. I've spent some time trying to fix it, but without success.

I'm using a standard Ubuntu 14 appliance build.

The issue is a loop: the Emerging Threats rules are downloaded, rendered, and then the whole cycle immediately repeats. I don't have the DB permission issue some other users have reported, where rendering fails because the celery-worker lacks permission on the database. In my case the rendering succeeds; it just starts over as soon as it finishes. I have 3 workers (3 CPUs) and all 3 consistently loop like this.

[2018-11-16 10:38:08,510: WARNING/Worker-3] Imported 15500 rules so far...
[2018-11-16 10:38:09,550: WARNING/Worker-3] Imported 16000 rules so far...
[2018-11-16 10:38:10,567: WARNING/Worker-3] Imported 16500 rules so far...
[2018-11-16 10:38:11,592: WARNING/Worker-3] Imported 17000 rules so far...
[2018-11-16 10:38:12,651: WARNING/Worker-3] Imported 17500 rules so far...
[2018-11-16 10:38:13,664: WARNING/Worker-3] Imported 18000 rules so far...
[2018-11-16 10:38:14,691: WARNING/Worker-3] Imported 18500 rules so far...
[2018-11-16 10:38:15,707: WARNING/Worker-3] Imported 19000 rules so far...
[2018-11-16 10:38:16,043: WARNING/Worker-3] Finished Importing 19142 rules. Committing data
[2018-11-16 10:38:16,044: INFO/Worker-3] Rendering rules.

==> mhn.log <==
2018-11-16 10:33:50,193 - /opt/mhn/server/mhn/tasks/rules.py - Fetching sources from 1 sources.
2018-11-16 10:33:50,195 - /opt/mhn/server/mhn/tasks/rules.py - Downloading from "http://rules.emergingthreats.net/open/snort-2.9.0/emerging.rules.tar.gz".
2018-11-16 10:34:16,022 - /opt/mhn/server/mhn/tasks/rules.py - Bulk importing 19142 rules.
2018-11-16 10:35:08,829 - /opt/mhn/server/mhn/tasks/rules.py - Rendering rules.
2018-11-16 10:37:10,118 - /opt/mhn/server/mhn/tasks/rules.py - Fetching sources from 1 sources.
2018-11-16 10:37:10,120 - /opt/mhn/server/mhn/tasks/rules.py - Downloading from "http://rules.emergingthreats.net/open/snort-2.9.0/emerging.rules.tar.gz".
2018-11-16 10:37:36,014 - /opt/mhn/server/mhn/tasks/rules.py - Bulk importing 19142 rules.
2018-11-16 10:38:16,044 - /opt/mhn/server/mhn/tasks/rules.py - Rendering rules.
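
For anyone who wants to poke at the same thing, this is the quick check I've been using to confirm what the workers are doing (a minimal sketch only; the Redis broker URL is my assumption based on a default MHN install, so adjust it to match your broker setting):

# Minimal sketch: ask the running celery workers what they are executing.
# Assumption: MHN's celery broker is Redis on localhost (change if yours differs).
from celery import Celery

app = Celery(broker="redis://localhost:6379/0")
insp = app.control.inspect()

print(insp.active())     # tasks each worker is executing right now
print(insp.scheduled())  # ETA tasks waiting on each worker
print(insp.reserved())   # tasks prefetched by workers but not yet started

In my case active() shows all three workers stuck on the same rule-fetch task, which matches the log output above.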


I've tried clearing out the celery queue, but that didn't seem to help either. I've also tried the usual stopping and restarting of the celery-worker processes, but as soon as celery-beat is restarted the looping begins again.
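
For reference, this is roughly how I cleared the queue (a minimal sketch; the broker URL is again my assumption from a default install):

# Minimal sketch: discard everything waiting in the celery queue.
# Assumption: default Redis broker on localhost (adjust to your install).
from celery import Celery

app = Celery(broker="redis://localhost:6379/0")

# purge() discards all waiting task messages and returns how many were removed
removed = app.control.purge()
print("Purged %d queued tasks" % removed)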

Thanks for any help or suggestions; I'm stuck.


2 Replies

Accepted Solution

Hi @jtrevaskis,

MineMeld does not use celery. I guess you have installed MHN on your server and that is what is causing the high CPU usage, but it's not related to MineMeld.

Luigi
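
A quick way to confirm which application owns those celery processes is to look at their command lines (a minimal sketch; it assumes the psutil package is installed, which may not be the case on a stock appliance):

# Minimal sketch: print the command line of every celery-related process.
# Assumption: psutil is available (pip install psutil).
import psutil

for proc in psutil.process_iter(["pid", "cmdline"]):
    cmd = " ".join(proc.info["cmdline"] or [])
    if "celery" in cmd:
        # An MHN worker will reference MHN's app/paths, not MineMeld's
        print("%s %s" % (proc.info["pid"], cmd))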

Oh of course, you're right, it's MHN.
