New Error Message I've Not Seen Before

L3 Networker

Caught this in the temporary file for log processing...



# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 113700864 bytes for committing reserved memory.
# An error report file with more information is saved as:
# /tmp/hs_err_pid51410.log
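For reference, the failed allocation above is only about 108 MB, so the VM is likely running very close to its memory limit overall. A quick sanity check on a generic Linux host (nothing product-specific here) might look like:

```shell
# Convert the failed mmap request (113700864 bytes) to MB -- roughly 108 MB
echo $((113700864 / 1024 / 1024))   # prints 108

# Show overall memory and swap, in MB
free -m

# The hs_err log header repeats the failed allocation and suggests remedies
[ -f /tmp/hs_err_pid51410.log ] && head -n 20 /tmp/hs_err_pid51410.log || true
```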


Also, I now have a situation where I cannot process any logs for at least one particular firewall; it seems to think there is a process already running for it. The screenshot shows that I cannot click the Process Files button (it never becomes clickable), and there is a link shown to the Spark processing status that is not actually reachable/running.





So, basically, I need to figure out how to clear this. It has survived a reboot already, so there has to be a file or something somewhere that I need to remove, I am guessing.
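In the meantime I have been hunting for a stale lock or PID file with something generic like this (the paths are guesses on my part; the app may keep its state elsewhere):

```shell
# Look for leftover lock/PID files in the usual spots (paths are guesses)
find /tmp /var/run -maxdepth 2 \( -name '*.lock' -o -name '*.pid' \) 2>/dev/null

# If a PID file turns up, check whether the recorded process is still alive;
# kill -0 only tests for process existence, it does not send a signal, e.g.:
# kill -0 "$(cat /var/run/example.pid)" 2>/dev/null && echo running || echo stale
```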

L4 Transporter

Re: New Error Message I've Not Seen Before

You can delete jobs by going to:

Settings -> Jobs -> Queue

You can then find that specific job in the list and remove it.


Another option is to go to the Dashboard and click the "Remove all" button for the agent. This will remove all jobs in the queue, including the Spark jobs.


Regarding the origin of the issue, which was a lack of memory for the JVM: how much RAM is your VM allocated? How much RAM is stated in /home/userSpace/environmentParameters.php?
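A quick way to compare the two numbers from the shell (the parameter name inside environmentParameters.php varies, so I would just grep broadly):

```shell
# Total RAM the VM sees, in MB
free -m | awk '/^Mem:/ {print $2}'

# Grep for memory-related settings in the config file (the parameter names
# are an assumption -- adjust the pattern if nothing matches)
grep -i -E 'mem|ram|xmx' /home/userSpace/environmentParameters.php || true
```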
