06-26-2018 07:43 AM - edited 06-27-2018 07:55 AM
Hi everyone,
I have added firewall logs from our Palo Alto 5000 Series to /PALogs on the Expedition VM. I copied the original .csv as a duplicate, so one copy is owned by root and the other by expedition. Both files appear under Devices > M.LEARNING, but when I run Process Files the job stays in pending and nothing happens. Any ideas what the issue may be?
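For reference, the ownership and permissions can be checked from the VM shell like this (this assumes the default /PALogs location and that the processing runs as the expedition user; adjust the paths if yours differ):

ls -l /PALogs/*.csv                              # confirm owner, group, and non-zero size
sudo chown expedition:expedition /PALogs/*.csv   # hand the files to the expedition user
sudo chmod 644 /PALogs/*.csv                     # make sure they are readable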
01-30-2020 07:09 AM
I ran a new file this morning and received the same error. The "1580390733_traffic_files.csv" is empty, as you said it would be.
---- CREATING SPARK Session:
warehouseLocation:/data/spark-warehouse
+--------+--------+-------+----+------------+
|fwSerial|panosver|csvpath|size|afterProcess|
+--------+--------+-------+----+------------+
+--------+--------+-------+----+------------+
Memory: 5838m
LogCollector&Compacter called with the following parameters:
Parameters for execution
Master[processes]:............ local[3]
Available RAM (MB):........... 5978112
User:......................... admin
debug:........................ false
Parameters for Job Connections
Task ID:...................... 2162
My IP:........................ 10.170.1.35
Expedition IP:................ 10.170.1.35:3306
Time Zone:.................... Europe/Helsinki
dbUser (dbPassword):.......... root (************)
projectName:.................. demo
Parameters for Data Sources
App Categories (source):........ (Expedition)
CSV Files Path:................./tmp/1580390733_traffic_files.csv
Parquet output path:.......... file:///data/connections.parquet
Temporary folder:............. /data
---- AppID DB LOAD:
Application Categories loading...
Application Categories loaded
Exception in thread "main" java.util.NoSuchElementException: next on empty iterator
at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)
at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)
at scala.collection.IterableLike$class.head(IterableLike.scala:107)
at scala.collection.mutable.ArrayOps$ofInt.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:234)
at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)
at scala.collection.mutable.ArrayOps$ofInt.head(ArrayOps.scala:234)
at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:441)
at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
ubuntu@ip-10-170-1-35:/tmp$
ubuntu@ip-10-170-1-35:/tmp$ more 1580390733_traffic_files.csv
fwSerial,panosver,csvpath,size,afterProcess
/
ubuntu@ip-10-170-1-35:/tmp$
01-30-2020 07:26 AM
Until we can do a Zoom session: the trace dies with "next on empty iterator", which is thrown when the job's list of input files comes up empty (note the empty fwSerial table at the top and the header-only temporary CSV). So, would it be possible that the firewall log files themselves are actually empty? It would be best to check this live and rule out other options.
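In the meantime, a quick way to check whether the exported logs contain any data rows (assuming they were copied to the default /PALogs directory) is:

wc -l /PALogs/*.csv    # 0 lines, or 1 line (header only), means the export was empty
head -3 /PALogs/*.csv  # eyeball the first rows of each file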