I'm new to this community. At the moment we are actively exploring MineMeld in our environment and would like to know if there are any connectors available for Splunk to consume intel collected by MineMeld.
My name is Brian Torres-Gil and my team owns the Splunk integration at Palo Alto Networks. A Minemeld-Splunk integration is in the works, and I'd love to hear any use cases you have so we can ensure they're handled by the integration. Please tell me what you'd like to see from a Splunk integration with Minemeld and any problems you'd solve with it. This will really help us with the final design.
We will provide MineMeld as a service for our PAN firewall customers. Therefore, it would be nice to see a graphical view of the currently connected firewalls and the feeds they are consuming.
This would be a nice feature to have inside MineMeld. With the current release, if you are already using Splunk or another system able to process syslog logs to create a dashboard, you can configure nginx on MineMeld to forward its logs to an external syslog server. Using the nginx logs, you can visualize and track which firewalls are connecting to which feeds.
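As a rough illustration only (the exact location of the nginx configuration on a MineMeld instance may differ, and the server address, facility, and tag below are placeholders), nginx can ship its access log to a syslog server with an `access_log` directive:

```
# Illustrative nginx directive: forward access logs to an external
# syslog server; server address, facility, and tag are placeholders
access_log syslog:server=splunk.example.com:514,facility=local7,tag=minemeld combined;
```

Each feed request then shows up at the syslog destination as a standard combined-format access log line, which a Splunk dashboard can split by client IP and requested feed path.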
I would also be interested in using the MineMeld app to ingest the node logs into Splunk, so that Splunk could have knowledge of the additions, updates, withdrawals, etc. occurring for each indicator.
Are you interested in sending indicator updates/withdrawals to Splunk, or in using the MineMeld feeds as lookup tables inside Splunk?
I was primarily interested in sending the updates/withdrawals to Splunk. There's some hesitation about implementing dynamic block lists everywhere on our network, and being able to audit the lists through a utility everyone is familiar with would do a lot to help assuage that.
I had been looking at just putting a forwarder on the MineMeld instance, but the log files I found that appear to contain the logs read by the MineMeld UI don't contain only text; it looks like there's some binary data in them as well.
There are two things you could do now for this:
1 - Use the logstash output node to push indicators to Logstash, then configure Logstash to forward the messages to Splunk. An open question here is the best message format for Logstash to use when pushing indicators to Splunk.
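A minimal sketch of such a Logstash pipeline, assuming the MineMeld logstash output node delivers JSON documents over TCP and that Splunk's HTTP Event Collector (HEC) is enabled (the port, URL, and token below are all placeholders, not values MineMeld mandates):

```
# Illustrative Logstash pipeline: receive JSON indicator events from
# MineMeld over TCP and forward them to Splunk HEC.
input {
  tcp {
    port  => 5514          # assumed listener port for the MineMeld output node
    codec => json_lines
  }
}
output {
  http {
    url         => "https://splunk.example.com:8088/services/collector/event"
    http_method => "post"
    format      => "json"
    headers     => { "Authorization" => "Splunk <your-HEC-token>" }
  }
}
```

HEC wraps each POSTed JSON document as a Splunk event, which sidesteps the binary-log problem mentioned above since only structured indicator events leave the box.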
2 - Use the minemeld-cef extension to generate messages in CEF format. My understanding is that Splunk can parse CEF.
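For reference, CEF messages are a pipe-delimited header followed by key=value extensions. Here is a small Python sketch of what an indicator event could look like in CEF; the field mapping (vendor, event name, `cs1`/`msg` extensions) is purely illustrative and is not necessarily what the minemeld-cef extension emits:

```python
def to_cef(indicator, action, feed, severity=5):
    """Build a CEF line for a MineMeld indicator event (illustrative mapping)."""
    def esc(value):
        # Pipes and backslashes must be escaped inside CEF header fields
        return str(value).replace("\\", "\\\\").replace("|", "\\|")

    header = "|".join([
        "CEF:0",                     # CEF version
        esc("PaloAltoNetworks"),     # device vendor
        esc("MineMeld"),             # device product
        esc("1.0"),                  # device version
        esc(action),                 # event class id (e.g. update/withdraw)
        esc("indicator " + action),  # human-readable event name
        str(severity),               # severity 0-10
    ])
    # Extension: key=value pairs; these field names are assumptions
    extension = "cs1Label=feed cs1={0} msg={1}".format(feed, indicator)
    return header + "|" + extension

print(to_cef("198.51.100.7", "update", "inboundfeedhc"))
# -> CEF:0|PaloAltoNetworks|MineMeld|1.0|update|indicator update|5|cs1Label=feed cs1=inboundfeedhc msg=198.51.100.7
```

A Splunk sourcetype with `KV_MODE` key-value extraction can then pull the extension fields out of each line.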
I found this page while looking at some Splunk/MineMeld integration posts.
I wrote a series of blog posts on Threat Intelligence automation using MineMeld and Splunk. You can find them here:
In post 1 I show the architecture.
In post 2 I show how to write a custom prototype and the IoC integration with our SOC Splunk application. This is the fully automated, near-real-time feature we are using today to check IoC access.
In post 3 I show how to create a STIX/TAXII output miner to export IoCs.
In post 4 I show how I integrated IoC events (updates/withdrawals) into Splunk; to do this I wrote a TA to parse the incoming data (via the logstash connector) and an app to show some stats (both on GitHub).
Hope this is useful
Hi! I know I'm late to the party, but I'd also like to monitor node updates coming from MineMeld to Splunk, and I'm having trouble finding the right queries to do so, probably because we're not very knowledgeable about Splunk here.
Our Splunk 7.1 instance is connected to some MineMeld outputs, and I can correctly find the indicators by using the | `mm_indicators` search or | from inputlookup:"minemeldfeeds_lookup" . What I need to do is compare last month's feeds to this month's feeds and return all the new indicators that have appeared in the last 30 days. Ultimately, this is to compare against NGFW security policy hits within the last month to know whether the new indicators have been hit or not.
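One possible search sketch, assuming the indicator events are indexed over time (for example via the logstash connector mentioned earlier in this thread) rather than only available as a lookup snapshot; the index name `minemeld` and the field name `indicator` are assumptions you would replace with your own:

```
index=minemeld earliest=-60d@d
| stats earliest(_time) AS first_seen BY indicator
| where first_seen >= relative_time(now(), "-30d@d")
| table indicator first_seen
```

The idea: look back 60 days, compute each indicator's first appearance, and keep only those first seen within the last 30 days, i.e. indicators new this month relative to last month. The result can then be matched against your NGFW threat/traffic logs. If you only have the lookup snapshot, you would instead need to save a copy of the lookup each month and diff the two with a second `inputlookup`.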