ML producing no results


L1 Bithead

I updated and also have a fresh install at 1.1.5.

I was originally getting errors even when processing logs, but I found the article that says to comment out bind_ip 127.0.0.1 in my.cnf.
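For anyone hitting the same database error, the edit looks roughly like this. This is a sketch against a demo copy of the file: the real my.cnf on the Expedition VM may live at a different path (e.g. /etc/mysql/my.cnf), and the key may be spelled bind-address depending on the MySQL/MariaDB build.

```shell
# Demo copy of the relevant my.cnf fragment; path and contents are stand-ins.
CNF=/tmp/my.cnf.demo
cat > "$CNF" <<'EOF'
[mysqld]
bind_ip = 127.0.0.1
port = 3306
EOF

# Comment out the bind line so the database stops binding to loopback only.
sed -i 's/^[[:space:]]*bind_ip/# bind_ip/' "$CNF"

grep bind_ip "$CNF"
```

After editing the real file, restart the database service so the change takes effect.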

 

Now it will successfully process the logs, but when I choose the log files, set them to ML, then run Discovery > Machine Learning and analyze, I get no results.

The process is as follows:

Start a new project (I don't get the option to choose a Purpose anymore? It's auto-set to Migration?)

Set API keys and grab the config

Dump logs via scheduled log export, and also tried export via CSV from the Monitor tab

Go to the device and process logs (completes)

Go to the project and import the config

Go to Policies, choose the correct Vsys, and set policies to ML

Go to Discovery and create a log connector to the correct device and Vsys

Hit Analyze and watch it complete its tasks

It says complete, and there is no data.

 

Any help would be great.

Senior Security Engineer
mydatapath.com

L5 Sessionator

When you imported the configuration from the PAN-OS device, it was via the device itself, and not via the XML config upload, right?

That is necessary for a proper mapping between the config and the device serial number that will be used to check the logs.

 

Also, are we assuming that the selected rule has traffic reported for the period of time selected in the log connector?

 

And make sure that you are running the latest version of Expedition, 1.1.4.

Thanks for the reply,

 

I used the device itself to import the config; I did not do an XML import.

 

The dump from the 5220 was over 10 GB (scheduled log export), and I also dumped the traffic log from the Monitor tab to CSV.

I chose a date range of the day of the export, the week the export was in, and the month and year, all with no results.
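One sanity check worth doing before blaming the tool: confirm the exported CSV actually contains rows inside the chosen date range. The snippet below uses a made-up two-row CSV; a real Monitor-tab export has many more columns, and the Receive Time column may sit in a different position, so adjust the field number and date pattern to your export.

```shell
# Tiny stand-in for a Monitor-tab traffic CSV export (columns are assumptions).
cat > /tmp/traffic.demo.csv <<'EOF'
Receive Time,Source,Destination
2018/05/01 10:00:00,10.0.0.1,8.8.8.8
2018/05/02 11:30:00,10.0.0.2,1.1.1.1
EOF

# Count log rows whose Receive Time falls on the day selected in the log
# connector; zero here would explain an "analysis complete, no data" result.
awk -F',' 'NR > 1 && $1 ~ /^2018\/05\/01/' /tmp/traffic.demo.csv | wc -l
```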

I also tried another PA-850, imported the same way, and got the same results.

I'm on Expedition 1.1.5 currently, but to be honest I'm at the point where I'm going to spin up another Expedition server to see if that helps.

The version I'm using is the OVA that was given out on this forum, so I will just spin up another one and test that.

 

N8

Senior Security Engineer
mydatapath.com

Let us know at fwmigrate at paloaltonetworks dot com if it still fails with the new image.

 

At that point, we would require some log files left in /tmp.

 

Best

Finally got it up and running.

It got stuck at Pending; I checked my.cnf and bind_ip was already commented out.

I updated to 1.1.5 and now there's no web interface.

Senior Security Engineer
mydatapath.com

After many hours, I finally got it.

I'm not sure what the issue was; maybe Spark was corrupted. I did expand the LVM to 40 GB, since the dumps I was using are well over 10 GB in size.
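For anyone needing to do the same, growing the disk is roughly a two-step job after enlarging the virtual disk in the hypervisor. The device and volume names below are assumptions; check yours with pvs, lvs, and df before running anything.

```shell
# 1. Tell LVM the physical volume grew (run after resizing the disk in the
#    hypervisor; /dev/sda2 is an assumed partition name).
pvresize /dev/sda2

# 2. Grow the logical volume into the free space and resize the filesystem in
#    one step (-r); the mapper path here is an assumed volume name.
lvextend -r -l +100%FREE /dev/mapper/Expedition--vg-root
```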

 

I re-created the VM and all is working now.

 

Can someone tell me how to add more CPU resources to Expedition? I've added 4 cores to it, but it only sees 1.
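In case anyone else hits this, a quick way to check what the guest kernel actually sees: if a core is present but offline, it can be brought online from inside the VM; if it doesn't appear at all, the fix is on the hypervisor side (vCPU count, or cores-per-socket settings).

```shell
# Cores currently online.
nproc

# Cores the kernel knows about (online or not).
ls -d /sys/devices/system/cpu/cpu[0-9]* | wc -l

# If a core is present but offline, bring it online (needs root; cpu1 is
# just an example):
# echo 1 > /sys/devices/system/cpu/cpu1/online
```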

 

Thanks

Senior Security Engineer
mydatapath.com

You sir.... are awesome.

 

Thank you

Senior Security Engineer
mydatapath.com
  • 1 accepted solution
  • 6621 Views
  • 7 replies
  • 0 Likes