Imported CSV log files not processing


L1 Bithead

Hi All,

I have been having issues with log files not showing up or being processed in ML.

The setup is Expedition on Ubuntu 20.04, connected to Panorama to pull in devices.

The selected firewall is set up to copy logs via SCP as my own user, which works fine.

I added a cron job to update permissions on the CSV files:

sudo crontab -l -u root
[sudo] password for username:
# Edit this file to introduce tasks to be run by cron.
....Excluded....
# m h dom mon dow command
25 23 * * * php /var/www/html/OS/spark/scripts/changeCSVLogRights.php

In Expedition, I expanded the Panorama device, edited the firewall, and created API access.

I can pull the config, etc.

I have used ML to create rule sets and pushed them back to the firewall via the API.

 

Since 12/10/21, logs have stopped processing.

Two things changed that I can think of: the SCP user changed from expedition to my own, and I upgraded from 1.1.111 to 1.1.112 and then to 1.1.113 yesterday, to see if it was fixed in the newest version.

In the WebUI, the logs don't show up.

On the CLI, I can see the logs, but they are still owned by my username (both user and group).

So the permissions-update script is not working:

grep chown /var/www/html/OS/spark/scripts/changeCSVLogRights.php
$command = "chown expedition:www-data ".$newFile."; chmod 660 ".$newFile.";";
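Based on that line from the script, a hypothetical manual equivalent (run as root against the real /PALogs files) would be `chown expedition:www-data` plus `chmod 660` per file. A minimal, safe sketch that demonstrates the same 660 mode change on a scratch file rather than the real logs:

```shell
# Hypothetical manual equivalent of the script's per-file command,
# run as root on the real log directory (do NOT run blindly):
#   chown expedition:www-data /PALogs/*.csv
#   chmod 660 /PALogs/*.csv

# Safe demonstration of the 660 mode on a throwaway file:
tmpdir=$(mktemp -d)
touch "$tmpdir/FW-01_traffic_test.csv"
chmod 660 "$tmpdir/FW-01_traffic_test.csv"
stat -c '%a' "$tmpdir/FW-01_traffic_test.csv"   # prints 660
rm -rf "$tmpdir"
```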

 

Manually updating the files to be owned by expedition, with a group of www-data and a mode of 660 as per the script, doesn't seem to fix it.

I had a local install running before deploying in prod, and that has the files owned by www-data with a group of expedition (i.e. backwards compared to the documentation and the script).

Changing this also does not fix it.

However, deleting the files under /data via Settings > M.Learning > Data analysis structures folder > DELETE Connection.parquet FILES, then searching for files under the device, brings up all log files available to be processed.
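For reference, my assumption is that the WebUI delete action boils down to removing the Connection.parquet files under /data on disk (the path and filename are taken from this post; the `find` command is my guess, demonstrated here on a throwaway directory tree rather than the real one):

```shell
# Assumed on-disk equivalent of "DELETE Connection.parquet FILES"
# (on the real box this would presumably be:
#   find /data -name 'Connection.parquet' -delete).
# Demonstrated on a throwaway directory tree:
tmpdir=$(mktemp -d)
mkdir -p "$tmpdir/FW-01"
touch "$tmpdir/FW-01/Connection.parquet"
find "$tmpdir" -name 'Connection.parquet' -delete
find "$tmpdir" -name 'Connection.parquet' | wc -l   # prints 0
rm -rf "$tmpdir"
```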

 

What are the file permissions supposed to be?

Why would I need to delete the /data/ files before the logs are seen by the WebUI for processing?

 

$ ls -ld /PALogs/
drwxr-xr-x 3 www-data www-data 4096 Oct 19 13:53 /PALogs/

 

/PALogs$ ls -l

-rw-rw---- 1 www-data expedition 8365183605 Oct 18 20:17 FW-01_traffic_2021_10_18_last_calendar_day.csv

 

 

 


Accepted Solutions

L1 Bithead

I ended up just running my own cron job (hourly) to change ownership to www-data:expedition.

This made jobs start running, but a day behind; i.e. I log in every day and there is always a CSV to process, even though the export completes hours before the log auto-processing time.

I have since upgraded to 1.2.0, rolled back to 1.1.113 due to the install issue, and then forward again to 1.2.0.

Logs are now processing overnight as expected.
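The hourly workaround described above might look something like this as a root crontab entry; the exact schedule, path, and glob are my assumptions, not the poster's actual entry:

```shell
# Hypothetical root crontab entry for the hourly ownership fix:
# 0 * * * * chown www-data:expedition /PALogs/*.csv
```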
