The file is sent directly by the firewall, and this is the serial number configured in Expedition. Panorama is not involved.
To get SCP to work I need to change the owner of /PALogs to expedition. Files then transfer without an issue, but they never show up under M.Learning to process via the web interface (the files are on Expedition and visible via an SSH shell).
Rich
If you are certain that the serial number is correct, then I would suggest checking the following.
I hope some of those points help.
There is only a single firewall involved. SCP from the firewall does not work unless I go into the CLI and change the /PALogs owner in Expedition to expedition.
Right now under Settings the Temporary Data Structure Folder is set to /opt/ml (the OVA install default). Do I need to change this to /PALogs so the files show up in the web interface?
Thank you, Rich
The Temporary Data Structure Folder is used for conversion, which will come after you have managed to "find" the original CSV files.
In the main screen at Expedition, you have health checks. One of them refers to the Temporary Data Structure folder and the rights to write inside. If the check passes, then you do not need to make changes on your /opt/ml folder (unless you prefer a different folder due to space limitations).
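If you want to confirm by hand what the health check verifies, the folder's ownership and mode can be read with `stat`. The sketch below uses a scratch directory purely for illustration; on the Expedition VM you would point `stat` at the real Temporary Data Structure Folder (/opt/ml by default):

```shell
# Illustrative check on a scratch directory; on the VM, run
#   stat -c '%U:%G %a' /opt/ml
# instead to see who owns the folder and with which mode.
d=$(mktemp -d)
chmod 770 "$d"              # rwx for owner and group
stat -c '%U:%G %a' "$d"     # prints "<owner>:<group> 770"
rmdir "$d"
```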
Going back to the CSV files that can't be found and that are located inside /PALogs: most probably the rights for www-data to read that folder were removed. Simply execute:
sudo chown expedition:www-data /PALogs
and then
sudo chmod 750 /PALogs
This makes the expedition user the owner of the folder, and the www-data group (which contains the www-data user) the group owner of the folder. The www-data group then has read and execute rights on the folder (on a directory, the execute bit is what allows opening the files inside it), while expedition has full read-write-execute rights. If you prefer, you can use 770 instead of 750 to also give www-data write rights, so the files can be compressed or deleted after processing (those are options when processing CSV files in Expedition).
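To verify the result, you can compare the mode bits before and after the change. The sketch below uses a scratch directory instead of /PALogs (which requires sudo on the VM) to show why the directory's execute bit matters for the group:

```shell
# Scratch-directory illustration of the mode change; substitute /PALogs
# (with sudo) on the real Expedition VM.
d=$(mktemp -d)
touch "$d/sample.csv"
chmod 750 "$d"            # owner rwx, group r-x: group can enter and read
stat -c '%a' "$d"         # prints 750
# With group "r--" only (e.g. 740), group members could list the directory
# entry but not open the files inside: on a directory, the execute bit
# grants traversal.
rm -r "$d"
```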
Done and same issue.
expedition@pan-expedition:/PALogs$ ls -al
total 16
drwxr----- 2 expedition www-data 4096 Dec 31 11:45 .
drwxr-xr-x 24 root root 4096 Dec 28 11:50 ..
-rw-rw-r-- 1 expedition expedition 944 Dec 31 13:00 pan-panos-vm50_traffic_2018_12_31_last_calendar_day.csv
-rw-rw-r-- 1 expedition expedition 17 Dec 31 12:55 ssh-export-test.txt
Dashboard is clean: no errors to remediate. The system looks good; I just cannot get the files to show up in the web interface to process.
Rich
TL;DR
One more thing, in the ML Settings, make sure the provided IP is the correct one.
Long Explanation
Even if it may sound strange, we designed Expedition so that it can be split into two parts: the config-management part and the Machine Learning part.
In most cases (maybe 99%), both parts run on the same machine. However, the management part needs to know how to reach the Machine Learning part to find the CSV logs, convert them into parquet files, perform the data analytics that generate rules, etc.
Why did we design it this way? We had in mind that some users may require a high-performance unit for data analytics, for instance with 24 CPUs and 256 GB of RAM. Maybe they even have a cluster for processing Spark jobs (which we use for Machine Learning). With that in mind, we started the design by untangling Expedition into a heavy part (which could be shared with other projects and perform the data analytics) and a light part that handles configurations and the rest of Expedition's features.
It is 192.168.55.120 (the default IP address for the LITB Expedition). Rich
If it still does not resolve the issue, please send us an email to fwmigrate at paloaltonetworks dot com, and we may try to do a live session to help you.
You didn't by chance fill up your drive on the Expedition VM, did you?
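A quick way to rule that out is `df`. The command below checks the root filesystem as a stand-in; on the VM you would also pass wherever /PALogs and the temp folder live, since they may sit on separate mounts:

```shell
# Show free space on the root filesystem; on the Expedition VM, add
# /PALogs and /opt/ml as extra arguments to cover their mount points.
df -h /
```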