<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic ML Jobs failing on 1.1.48.2 in Expedition Discussions</title>
    <link>https://live.paloaltonetworks.com/t5/expedition-discussions/ml-jobs-failing-on-1-1-48-2/m-p/298773#M2137</link>
    <description>&lt;P&gt;Hello all&lt;/P&gt;
&lt;P&gt;Our Expedition instance failed to compress ML logs after processing and ran out of storage. I noticed this too late, after a couple of days of it not working.&lt;/P&gt;
&lt;P&gt;I freed up space by deleting old compressed files and manually gzipping the old CSV files that had been processed but not compressed.&lt;/P&gt;
&lt;P&gt;Then I upgraded to the latest version, 1.1.48.2.&lt;/P&gt;
&lt;P&gt;When I try to process files now, it fails instantly. I'm sharing the content of the /tmp/error_logCoCo file below.&lt;/P&gt;
&lt;P&gt;Any ideas what could be wrong?&lt;/P&gt;
&lt;P&gt;Thanks&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;root@Expedition:/storage/ExpeditionLogs# cat /tmp/error_logCoCo&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]&lt;BR /&gt;---- CREATING SPARK Session:&lt;BR /&gt;warehouseLocation:/work/Parquet/tempdata/spark-warehouse&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]&lt;BR /&gt;---- CREATING SPARK Session:&lt;BR /&gt;warehouseLocation:/work/Parquet/tempdata/spark-warehouse&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;|fwSerial|panosver|csvpath|size|afterProcess|&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1510&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573805842_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Exception in thread "main" java.util.NoSuchElementException: next on empty iterator&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)&lt;BR /&gt;at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)&lt;BR /&gt;at scala.collection.IterableLike$class.head(IterableLike.scala:107)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:234)&lt;BR /&gt;at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.head(ArrayOps.scala:234)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:441)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;|fwSerial|panosver|csvpath|size|afterProcess|&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1511&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573805850_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Exception in thread "main" java.util.NoSuchElementException: next on empty iterator&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)&lt;BR /&gt;at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)&lt;BR /&gt;at scala.collection.IterableLike$class.head(IterableLike.scala:107)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:234)&lt;BR /&gt;at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.head(ArrayOps.scala:234)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:441)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: 
Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]&lt;BR /&gt;---- CREATING SPARK Session:&lt;BR /&gt;warehouseLocation:/work/Parquet/tempdata/spark-warehouse&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;|fwSerial|panosver|csvpath|size|afterProcess|&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1512&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573806573_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Exception in thread "main" java.util.NoSuchElementException: next on empty iterator&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)&lt;BR /&gt;at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)&lt;BR /&gt;at scala.collection.IterableLike$class.head(IterableLike.scala:107)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:234)&lt;BR /&gt;at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.head(ArrayOps.scala:234)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:441)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: 
Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]&lt;BR /&gt;---- CREATING SPARK Session:&lt;BR /&gt;warehouseLocation:/work/Parquet/tempdata/spark-warehouse&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;|fwSerial|panosver|csvpath|size|afterProcess|&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1513&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573806922_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Exception in thread "main" java.util.NoSuchElementException: next on empty iterator&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)&lt;BR /&gt;at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)&lt;BR /&gt;at scala.collection.IterableLike$class.head(IterableLike.scala:107)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:234)&lt;BR /&gt;at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.head(ArrayOps.scala:234)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:441)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: 
Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]&lt;BR /&gt;---- CREATING SPARK Session:&lt;BR /&gt;warehouseLocation:/work/Parquet/tempdata/spark-warehouse&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;|fwSerial|panosver|csvpath|size|afterProcess|&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1514&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573806951_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Exception in thread "main" java.util.NoSuchElementException: next on empty iterator&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)&lt;BR /&gt;at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)&lt;BR /&gt;at scala.collection.IterableLike$class.head(IterableLike.scala:107)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:234)&lt;BR /&gt;at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.head(ArrayOps.scala:234)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:441)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: 
Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]&lt;BR /&gt;---- CREATING SPARK Session:&lt;BR /&gt;warehouseLocation:/work/Parquet/tempdata/spark-warehouse&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;|fwSerial|panosver|csvpath|size|afterProcess|&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1515&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573807129_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Exception in thread "main" java.util.NoSuchElementException: next on empty iterator&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)&lt;BR /&gt;at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)&lt;BR /&gt;at scala.collection.IterableLike$class.head(IterableLike.scala:107)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:234)&lt;BR /&gt;at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.head(ArrayOps.scala:234)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:441)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;/P&gt;</description>
    <pubDate>Fri, 15 Nov 2019 08:57:44 GMT</pubDate>
    <dc:creator>NecipCebeci</dc:creator>
    <dc:date>2019-11-15T08:57:44Z</dc:date>
    <item>
      <title>ML Jobs failing on 1.1.48.2</title>
      <link>https://live.paloaltonetworks.com/t5/expedition-discussions/ml-jobs-failing-on-1-1-48-2/m-p/298773#M2137</link>
      <description>&lt;P&gt;Hello all&lt;/P&gt;
&lt;P&gt;Our Expedition instance failed to compress ML logs after processing and ran out of storage. I noticed this too late, after a couple of days of it not working.&lt;/P&gt;
&lt;P&gt;I freed up space by deleting old compressed files and manually gzipping the old CSV files that had been processed but not compressed.&lt;/P&gt;
&lt;P&gt;Then I upgraded to the latest version, 1.1.48.2.&lt;/P&gt;
&lt;P&gt;When I try to process files now, it fails instantly. I'm sharing the content of the /tmp/error_logCoCo file below.&lt;/P&gt;
&lt;P&gt;Any ideas what could be wrong?&lt;/P&gt;
&lt;P&gt;Thanks&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;root@Expedition:/storage/ExpeditionLogs# cat /tmp/error_logCoCo&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]&lt;BR /&gt;---- CREATING SPARK Session:&lt;BR /&gt;warehouseLocation:/work/Parquet/tempdata/spark-warehouse&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/extraLibraries/slf4j-nop-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/opt/Spark/spark-2.4.3-bin-hadoop2.7/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;SLF4J: Actual binding is of type [org.slf4j.helpers.NOPLoggerFactory]&lt;BR /&gt;---- CREATING SPARK Session:&lt;BR /&gt;warehouseLocation:/work/Parquet/tempdata/spark-warehouse&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;|fwSerial|panosver|csvpath|size|afterProcess|&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1510&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573805842_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Exception in thread "main" java.util.NoSuchElementException: next on empty iterator&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)&lt;BR /&gt;at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)&lt;BR /&gt;at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)&lt;BR /&gt;at scala.collection.IterableLike$class.head(IterableLike.scala:107)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.scala$collection$IndexedSeqOptimized$$super$head(ArrayOps.scala:234)&lt;BR /&gt;at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)&lt;BR /&gt;at scala.collection.mutable.ArrayOps$ofInt.head(ArrayOps.scala:234)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter$.main(LogCollectorCompacter.scala:441)&lt;BR /&gt;at com.paloaltonetworks.tbd.LogCollectorCompacter.main(LogCollectorCompacter.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;|fwSerial|panosver|csvpath|size|afterProcess|&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;BR /&gt;+--------+--------+-------+----+------------+&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1511&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573805850_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1512&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573806573_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1513&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573806922_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1514&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573806951_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
&lt;P&gt;Memory: 12837m&lt;BR /&gt;LogCollector&amp;amp;Compacter called with the following parameters:&lt;BR /&gt;Parameters for execution&lt;BR /&gt;Master[processes]:............ local[6]&lt;BR /&gt;Available RAM (MB):........... 13145088&lt;BR /&gt;User:......................... N62231&lt;BR /&gt;debug:........................ false&lt;BR /&gt;Parameters for Job Connections&lt;BR /&gt;Task ID:...................... 1515&lt;BR /&gt;My IP:........................ 172.16.201.223&lt;BR /&gt;Expedition IP:................ 172.16.201.223:3306&lt;BR /&gt;Time Zone:.................... Europe/Helsinki&lt;BR /&gt;dbUser (dbPassword):.......... root (************)&lt;BR /&gt;projectName:.................. demo&lt;BR /&gt;Parameters for Data Sources&lt;BR /&gt;App Categories (source):........ (Expedition)&lt;BR /&gt;CSV Files Path:................./tmp/1573807129_traffic_files.csv&lt;BR /&gt;Parquet output path:.......... file:///work/Parquet/connections.parquet&lt;BR /&gt;Temporary folder:............. /work/Parquet/tempdata&lt;BR /&gt;---- AppID DB LOAD:&lt;BR /&gt;Application Categories loading...&lt;BR /&gt;Application Categories loaded&lt;/P&gt;
</description>
      <pubDate>Fri, 15 Nov 2019 08:57:44 GMT</pubDate>
      <guid>https://live.paloaltonetworks.com/t5/expedition-discussions/ml-jobs-failing-on-1-1-48-2/m-p/298773#M2137</guid>
      <dc:creator>NecipCebeci</dc:creator>
      <dc:date>2019-11-15T08:57:44Z</dc:date>
    </item>
    <item>
      <title>Re: ML Jobs failing on 1.1.48.2</title>
      <link>https://live.paloaltonetworks.com/t5/expedition-discussions/ml-jobs-failing-on-1-1-48-2/m-p/299435#M2144</link>
      <description>&lt;P&gt;Thanks for reporting.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Working on it.&lt;/P&gt;</description>
      <pubDate>Tue, 19 Nov 2019 09:05:02 GMT</pubDate>
      <guid>https://live.paloaltonetworks.com/t5/expedition-discussions/ml-jobs-failing-on-1-1-48-2/m-p/299435#M2144</guid>
      <dc:creator>dgildelaig</dc:creator>
      <dc:date>2019-11-19T09:05:02Z</dc:date>
    </item>
  </channel>
</rss>

