<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: delete all logs from panorama in Panorama Discussions</title>
    <link>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470570#M786</link>
    <description>&lt;P&gt;Thanks for the reply!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Rebooted - same&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Upgraded to 9.1.13 - got a bit further, but after sitting overnight it's now stuck at 87.xxxx for&amp;nbsp;active_shards_percent_as_number.&lt;/P&gt;
&lt;P&gt;I did open a ticket with PA; they aren't sure either. We'll try deleting the logs and see if we can bring the Elasticsearch cluster back to life. It's been almost a week with no consolidated logs, so we're just hoping for a fix any way we can.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Fri, 04 Mar 2022 15:34:14 GMT</pubDate>
    <dc:creator>czane</dc:creator>
    <dc:date>2022-03-04T15:34:14Z</dc:date>
    <item>
      <title>delete all logs from panorama</title>
      <link>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470315#M779</link>
      <description>&lt;P&gt;Our Panorama M600 is in a weird state with regard to logging. Pushing configs to devices works just fine, but es-health is red and has been for the last few days. We thought it was rebuilding, but it sure looks like it's totally broken.&lt;/P&gt;</description>
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We are thinking of wiping all data and starting from scratch (which is okay, since we have logs on the firewalls to fall back on). Can you just delete the Managed Collector, remove all the disks, then recreate the collector and re-add the disks so everything starts from scratch?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We're not sure if removing and re-adding the disk pairs in the managed collector will remove all data or keep the data on the RAID (which we want to remove).&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Anyone know how to do that without having to reconfigure Pano from scratch?&lt;/P&gt;</description>
      <pubDate>Thu, 03 Mar 2022 21:05:17 GMT</pubDate>
      <guid>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470315#M779</guid>
      <dc:creator>czane</dc:creator>
      <dc:date>2022-03-03T21:05:17Z</dc:date>
    </item>
    <item>
      <title>Re: delete all logs from panorama</title>
      <link>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470364#M781</link>
      <description>&lt;P&gt;Thank you for the post&amp;nbsp;&lt;a href="https://live.paloaltonetworks.com/t5/user/viewprofilepage/user-id/4870"&gt;@czane&lt;/a&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;To be honest, I do not believe that a red Elasticsearch status on the log collector is a valid reason to wipe it. I would resort to that option only if it were unavoidable. I have come across the red Elasticsearch status issue a few times in the past. In some cases it was a bug that was resolved by a PAN-OS upgrade.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;If possible, could you share which PAN-OS version you are running?&lt;/P&gt;
&lt;P&gt;Could you also provide output from:&amp;nbsp;&lt;STRONG&gt;show log-collector-es-cluster&lt;/STRONG&gt; as well as:&amp;nbsp;&lt;STRONG&gt;show log-collector detail&lt;/STRONG&gt;?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;BTW, have you tried rebooting Panorama?&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Kind Regards&lt;/P&gt;
&lt;P&gt;Pavel&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 04 Mar 2022 01:17:26 GMT</pubDate>
      <guid>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470364#M781</guid>
      <dc:creator>PavelK</dc:creator>
      <dc:date>2022-03-04T01:17:26Z</dc:date>
    </item>
    <item>
      <title>Re: delete all logs from panorama</title>
      <link>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470367#M782</link>
      <description>&lt;P&gt;We're at 9.1.10, and we've rebooted once after we noticed no logs coming in. Maybe I'll try upgrading to 9.1.13 and see what happens - can't hurt.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;admin@UH-Panorama&amp;gt; show log-collector-es-cluster health&lt;BR /&gt;&lt;BR /&gt;{&lt;BR /&gt;"cluster_name" : "__pan_cluster__",&lt;BR /&gt;"status" : "red",&lt;BR /&gt;"timed_out" : false,&lt;BR /&gt;"number_of_nodes" : 3,&lt;BR /&gt;"number_of_data_nodes" : 2,&lt;BR /&gt;"active_primary_shards" : 772,&lt;BR /&gt;"active_shards" : 774,&lt;BR /&gt;"relocating_shards" : 0,&lt;BR /&gt;"initializing_shards" : 0,&lt;BR /&gt;"unassigned_shards" : 158,&lt;BR /&gt;"delayed_unassigned_shards" : 0,&lt;BR /&gt;"number_of_pending_tasks" : 0,&lt;BR /&gt;"number_of_in_flight_fetch" : 0,&lt;BR /&gt;"task_max_waiting_in_queue_millis" : 0,&lt;BR /&gt;"active_shards_percent_as_number" : 83.04721030042919&lt;BR /&gt;}&lt;BR /&gt;&lt;BR /&gt;&lt;/PRE&gt;
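A quick way to read the output above: active_shards_percent_as_number is just active shards divided by the total shard count, so the 158 unassigned shards account for the missing ~17%. A minimal sketch of that arithmetic (plain Python, nothing PAN-OS-specific; the numbers are copied from the health output above):

```python
import json

# Cluster health fields as reported by "show log-collector-es-cluster health" above
health = json.loads("""
{
  "status": "red",
  "active_shards": 774,
  "relocating_shards": 0,
  "initializing_shards": 0,
  "unassigned_shards": 158
}
""")

# Total shards the cluster knows about, assigned or not
total = (health["active_shards"] + health["relocating_shards"]
         + health["initializing_shards"] + health["unassigned_shards"])

# This reproduces active_shards_percent_as_number from the output above
pct = 100.0 * health["active_shards"] / total
print(f"{pct:.5f}% of shards active ({health['unassigned_shards']} unassigned)")
# prints "83.04721% of shards active (158 unassigned)"
```

Until that percentage reaches 100 (and unassigned_shards reaches 0), the cluster status stays red or yellow; a value that stops moving, as described later in the thread, suggests the unassigned shards are not recovering on their own.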
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;PRE&gt;admin@UH-Panorama&amp;gt; show log-collector all&lt;BR /&gt;&lt;BR /&gt;&lt;BR /&gt;Serial CID Hostname Connected Config Status SW Version IPv4 - IPv6 &lt;BR /&gt;---------------------------------------------------------------------------------------------------------&lt;BR /&gt;[serialnum] 4 UH-Panorama yes In Sync 9.1.10 [ip address] - unknown&lt;BR /&gt;&lt;BR /&gt;Redistribution status: none&lt;BR /&gt;Last commit-all: commit succeeded, current ring version 1&lt;BR /&gt;SearchEngine status: Unknown&lt;BR /&gt;md5sum 4f5f09b388c8b735caa1b0ab4d6c543c updated at ?&lt;BR /&gt;&lt;BR /&gt;Certificate Status: &lt;BR /&gt;Certificate subject Name: &lt;BR /&gt;Certificate expiry at: none&lt;BR /&gt;Connected at: none&lt;BR /&gt;Custom certificate Used: no&lt;BR /&gt;Last masterkey push status: Unknown&lt;BR /&gt;Last masterkey push timestamp: none&lt;/PRE&gt;</description>
      <pubDate>Fri, 04 Mar 2022 01:28:03 GMT</pubDate>
      <guid>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470367#M782</guid>
      <dc:creator>czane</dc:creator>
      <dc:date>2022-03-04T01:28:03Z</dc:date>
    </item>
    <item>
      <title>Re: delete all logs from panorama</title>
      <link>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470377#M783</link>
      <description>&lt;P&gt;Thank you for the reply&amp;nbsp;&lt;a href="https://live.paloaltonetworks.com/t5/user/viewprofilepage/user-id/4870"&gt;@czane&lt;/a&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;While I was running 9.1.10, I hit this bug:&amp;nbsp;&lt;SPAN&gt;PAN-166557, after I added an M-600 as a new dedicated log collector. The symptom was the same: the Elasticsearch service status was red. You might be facing a different issue, but as a next step I would recommend&amp;nbsp;upgrading to 9.1.13.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Kind Regards&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Pavel&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Fri, 04 Mar 2022 02:02:13 GMT</pubDate>
      <guid>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470377#M783</guid>
      <dc:creator>PavelK</dc:creator>
      <dc:date>2022-03-04T02:02:13Z</dc:date>
    </item>
    <item>
      <title>Re: delete all logs from panorama</title>
      <link>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470570#M786</link>
      <description>&lt;P&gt;Thanks for the reply!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Rebooted - same&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Upgraded to 9.1.13 - got a bit further, but after sitting overnight it's now stuck at 87.xxxx for&amp;nbsp;active_shards_percent_as_number.&lt;/P&gt;
&lt;P&gt;I did open a ticket with PA; they aren't sure either. We'll try deleting the logs and see if we can bring the Elasticsearch cluster back to life. It's been almost a week with no consolidated logs, so we're just hoping for a fix any way we can.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 04 Mar 2022 15:34:14 GMT</pubDate>
      <guid>https://live.paloaltonetworks.com/t5/panorama-discussions/delete-all-logs-from-panorama/m-p/470570#M786</guid>
      <dc:creator>czane</dc:creator>
      <dc:date>2022-03-04T15:34:14Z</dc:date>
    </item>
  </channel>
</rss>

