Commit of Transit Template fails upon full push without reason. New build


L1 Bithead

I am able to push a different template if I move FW members around, but not the transit stack that I built per the documentation.

1 accepted solution

Accepted Solutions

Cyber Elite

Hello @MKurowksi

 

Did you get any error? Could you check the logs from the CLI to see if they give more details about the failure?

 

Panorama: tail follow yes mp-log configd.log
FW: tail follow yes mp-log devsrv.log
 
Kind Regards
Pavel
Help the community: Like helpful comments and mark solutions.


3 REPLIES


L1 Bithead

Sure, thanks for having a look. Below are the tail logs from the Panorama side during the push. The GUI did not provide any of that detail. It appears that it does not like the interface config, hmm. I'll include the FW-side logs in the next reply.


2023-07-07 08:29:15.433 -0500 Commit All job 98 scheduled
2023-07-07 08:29:15.575 -0500 decrypt tpl before pushing
2023-07-07 08:29:15.627 -0500 Panorama push template Transit with merge-with-candidate-cfg flags set.JobId=98.User=XXXXXXXXXXXX. Device count=2. Dequeue time=2023/07/07 08:29:15.
2023-07-07 08:29:15.746 -0500 decrypt tpl before pushing
2023-07-07 08:29:15.798 -0500 Panorama push template Transit with merge-with-candidate-cfg flags set.JobId=98.User=XXXXXXX. Device count=2. Dequeue time=2023/07/07 08:29:15.
2023-07-07 08:29:15.801 -0500 Updated job estimate avg time to 0 secs for job type 41
Connection to Update server: updates.paloaltonetworks.com completed successfully, initiated by 10.31.4.4
2023-07-07 08:29:20.371 -0500 logbuffer: no active connection to cms0
2023-07-07 08:29:40.372 -0500 logbuffer: no active connection to cms0
2023-07-07 08:29:55.366 -0500 Template update device status - tplname: Transit, dname: 007957000XXXXXX, jobid: 98, result: error, validate only: no, has_warnings: yes, msg: <msg cmd="push-data" dname="007957000XXXXXX" tplname="Transit" jobid="98" result="error">
<errors>
<line>Validation Error:</line>
<line><![CDATA[ network -> virtual-router -> vr-public-untrust -> interface 'ethernet1/1' is not a valid reference]]></line>
<line><![CDATA[ network -> virtual-router -> vr-public-untrust -> interface is invalid]]></line>
<line>failed to add route to vr vr-public-untrust(Module: routed)</line>
<line>client routed phase 1 failure</line>
<line>Commit failed</line>
</errors>
<warnings> </warnings>
</msg>
2023-07-07 08:29:58.209 -0500 Template update device status - tplname: Transit, dname: 007957000XXXXXX, jobid: 98, result: error, validate only: no, has_warnings: yes, msg: <msg cmd="push-data" dname="007957000XXXXXX" tplname="Transit" jobid="98" result="error">
<errors>
<line>Validation Error:</line>
<line><![CDATA[ network -> virtual-router -> vr-public-untrust -> interface 'ethernet1/1' is not a valid reference]]></line>
<line><![CDATA[ network -> virtual-router -> vr-public-untrust -> interface is invalid]]></line>
<line>failed to add route to vr vr-public-untrust(Module: routed)</line>
<line>client routed phase 1 failure</line>
<line>Commit failed</line>
</errors>
<warnings> </warnings>
</msg>
2023-07-07 08:29:59.891 -0500 SVM registration. Serial:007957000XXXXXX DG: TPL: vm-mode:0 uuid:19513526-08B6-7247-B3CF-44C42BAF62AE cpuid:AZR:F1060400FFFB8B1F svm_id:
/proc/cpuinfo: No such file or directory
2023-07-07 08:30:00.372 -0500 logbuffer: no active connection to cms0
/proc/cpuinfo: No such file or directory
2023-07-07 08:30:03.344 -0500 SVM registration. Serial:007957000XXXXXX DG: TPL: vm-mode:0 uuid:56F22316-387D-7D42-82A6-4616A4B67597 cpuid:AZR:F1060400FFFB8B1F svm_id:
/proc/cpuinfo: No such file or directory
2023-07-07 08:30:03.617 -0500 Error: pan_plugin_device_register_callback(pan_plugin_device.c:272): Event failed for plugin azure Plugin: malformed scripts/commit_added_device.py result for azure
2023-07-07 08:30:03.617 -0500 processing a status message from 007957000XXXXXX
2023-07-07 08:30:03.617 -0500 pan_conn_set_conn_details: device_type:server: client is device
2023-07-07 08:30:03.618 -0500 Got HA state from device 007957000XXXXXX, local state: unknown, peer state: unknown
/proc/cpuinfo: No such file or directory
2023-07-07 08:30:07.059 -0500 Error: pan_plugin_device_register_callback(pan_plugin_device.c:272): Event failed for plugin azure Plugin: malformed scripts/commit_added_device.py result for azure
2023-07-07 08:30:07.059 -0500 processing a status message from 007957000XXXXXX
2023-07-07 08:30:07.059 -0500 pan_conn_set_conn_details: device_type:server: client is device
2023-07-07 08:30:07.059 -0500 Got HA state from device 007957000XXXXXX, local state: unknown, peer state: unknown
2023-07-07 08:30:20.372 -0500 logbuffer: no active connection to cms0
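The push-data status entries in configd.log are XML `<msg>` blocks, so the actual error lines can be pulled out programmatically once an excerpt is saved off. A minimal sketch using only stdlib XML parsing (the message text below is copied from the log above; this is not a Palo Alto tool, just a way to read the excerpt):

```python
import xml.etree.ElementTree as ET

# One push-data status message, as captured in configd.log above
# (serial number redacted, as in the log excerpt).
status = """<msg cmd="push-data" dname="007957000XXXXXX" tplname="Transit" jobid="98" result="error">
<errors>
<line>Validation Error:</line>
<line><![CDATA[ network -> virtual-router -> vr-public-untrust -> interface 'ethernet1/1' is not a valid reference]]></line>
<line><![CDATA[ network -> virtual-router -> vr-public-untrust -> interface is invalid]]></line>
<line>failed to add route to vr vr-public-untrust(Module: routed)</line>
<line>client routed phase 1 failure</line>
<line>Commit failed</line>
</errors>
<warnings> </warnings>
</msg>"""

root = ET.fromstring(status)
if root.get("result") == "error":
    # CDATA sections parse as ordinary element text.
    for line in root.iterfind("errors/line"):
        print(line.text)
```

The first two error lines are the meaningful ones here: the virtual router vr-public-untrust references ethernet1/1, but that interface is not a valid reference on the receiving device, so routed aborts the commit in phase 1.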

 

L1 Bithead

Below are the logs from the firewall, from a separate attempt.

 

2023-07-07 06:30:05.510 -0700 start to destruct mlav_info
2023-07-07 06:30:05.510 -0700 start to destruct mlav_info 0x55bcbbfe3850
2023-07-07 06:30:05.549 -0700 Error: pan_ctrl_config_phase1(pan_controller_proc.c:1348): pan_ctrl_parse_config() failed
2023-07-07 06:30:05.549 -0700 Config commit phase1 failed
2023-07-07 06:30:05.549 -0700 Deleted alt data in redis
2023-07-07 06:30:05.549 -0700 No need to sync base ids in cfg
2023-07-07 06:30:05.550 -0700 Error: cfgagent_modify_callback(pan_cfgagent.c:98): Modify string (sw.mgmt.runtime.clients.device.err) error: USER (1)
2023-07-07 06:30:05.948 -0700 kill SIGUSR1 to pid 0
2023-07-07 06:30:05.948 -0700 Sending phase_abort to DP
2023-07-07 06:30:05.948 -0700 Phase_abort to DP done, Setting ctrl state to IDLE
2023-07-07 06:37:21.981 -0700 [CFG_OPT] Input Phase 0: cfg : 0x55bcb7ece000 len: 14192985
2023-07-07 06:37:21.981 -0700 Config commit phase0 started
2023-07-07 06:37:23.323 -0700 Updated shmgr phase0 count to 12
2023-07-07 06:37:23.330 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application rmi-iiop-base has the same id 976 as rmi-iiop.
2023-07-07 06:37:23.331 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application openai-base has the same id 11365 as openai.
2023-07-07 06:37:23.331 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application opc-hda-base has the same id 11152 as opc-hda.
2023-07-07 06:37:23.332 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application omron-fins-base has the same id 10577 as omron-fins.
2023-07-07 06:37:23.332 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application ms-visual-studio-tfs-base has the same id 1439 as ms-visual-studio-tfs.
2023-07-07 06:37:23.332 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application jiomeet-base has the same id 11334 as jiomeet.
2023-07-07 06:37:23.333 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application igel-base has the same id 2383 as igel.
2023-07-07 06:37:23.333 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application hart-ip-base has the same id 10787 as hart-ip.
2023-07-07 06:37:23.333 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application ieee-1278-dis-base has the same id 11174 as ieee-1278-dis.
2023-07-07 06:37:23.334 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application df1-base has the same id 11240 as df1.
2023-07-07 06:37:23.334 -0700 Warning: pan_config_populate_global_app(pan_config_handler.c:951): application cti-camp-base has the same id 11231 as cti-camp.
2023-07-07 06:37:24.263 -0700 Config commit phase0 done
2023-07-07 06:37:24.263 -0700 [CFG_OPT] Write back full cfg
2023-07-07 06:37:24.283 -0700 [CFG_OPT] Final Phase 0: cfg : 0x55bc96612000 len: 12763476
2023-07-07 06:37:39.029 -0700 [CFG_OPT] Input Phase 1: cfg : 0x55bc9f036000 len: 14012004
2023-07-07 06:37:39.029 -0700 Config commit phase1 started
2023-07-07 06:37:39.029 -0700 flags 0x40, content 0x1, not devsrvr only, not content only
2023-07-07 06:37:44.231 -0700 pan_ctrl_config_phase1: no cloud server is configured.
2023-07-07 06:37:44.269 -0700 Disable cloudapp setting is not configured
2023-07-07 06:37:44.269 -0700 Unset AddrObjRefresh Dirty: 0
2023-07-07 06:37:44.269 -0700 [CFG_OPT] Final Phase 1: cfg : 0x55bc9f036000 len: 14012004
2023-07-07 06:37:44.316 -0700 Get virus from last committed config
2023-07-07 06:37:44.316 -0700 Get wildfire from last committed config
2023-07-07 06:37:44.316 -0700 Get wpc from last committed config
2023-07-07 06:37:44.316 -0700 Get raven from last committed config
2023-07-07 06:37:44.317 -0700 TDB compilation started. tdb_compile_flag: 0x1 custom_dns 0
2023-07-07 06:37:44.317 -0700 compile type 0x1 (1)
2023-07-07 06:37:49.441 -0700 Config commit phase1 abort
2023-07-07 06:37:49.445 -0700 Error: cfgagent_modify_callback(pan_cfgagent.c:98): Modify string (sw.mgmt.runtime.clients.device.err) error: USER (1)
2023-07-07 06:37:49.509 -0700 Deleted alt data in redis
2023-07-07 06:37:49.510 -0700 No need to sync base ids in cfg
2023-07-07 06:37:49.510 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:50.511 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:51.194 -0700 Response received: load config from file ...
IP127.0.0.1
DNS_PRIMARY127.0.0.1
DNS_PORT_PRIMARY1053
CONNECT_TIMEOUT5s
RATE_PER_100MS400
...

opts :
[('-r', ''), ('-a', '')]

args :
[]

config ...
IP : 127.0.0.1
DNS_PRIMARY : 127.0.0.1
DNS_PORT_PRIMARY : 1053
CONNECT_TIMEOUT : 5s
RATE_PER_100MS : 0
EP ENABLE : True
TP ENABLE : False
SNI_CHECK : False
PROXY_AUTH : False
SAML : False
...

No need to remove: /etc/pan_auth.conf does not exist
Envoy proxy authentication is not configured
Envoy Notifier:DONE!

Envoy config successfully reloaded

2023-07-07 06:37:51.511 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:52.511 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:53.511 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:54.511 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:55.431 -0700 Content Engine version: 0xb000000 APP version: 0x21c41e20, Threat 0x21c41e20, virus 0x0, wildfire 0x0 type 1
2023-07-07 06:37:55.511 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:55.538 -0700 Primary checking
2023-07-07 06:37:55.899 -0700 [Cache] Load /opt/pancfg/mgmt/content//cache/b0000//tdb.cache.ser-0-hs0-0 success
2023-07-07 06:37:55.906 -0700 Primary checks done
2023-07-07 06:37:55.906 -0700 [TDB] Loading tdb cache /opt/pancfg/mgmt/content//cache/b0000//tdb.cache.ser-0-hs0-0 with wildfire 0/0 virus 0/0
2023-07-07 06:37:55.906 -0700 calc md5
2023-07-07 06:37:56.511 -0700 tdb compile flag is still up, abort thread wait 1 second
adding ottawa name rmi-iiop to id 12900
adding ottawa name openai to id 12901
adding ottawa name opc-hda to id 12902
adding ottawa name omron-fins to id 12903
adding ottawa name ms-visual-studio-tfs to id 12904
adding ottawa name jiomeet to id 12905
adding ottawa name igel to id 12906
adding ottawa name hart-ip to id 12907
adding ottawa name ieee-1278-dis to id 12908
adding ottawa name df1 to id 12909
adding ottawa name cti-camp to id 12910
2023-07-07 06:37:57.512 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:58.512 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:37:59.512 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:38:00.512 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:38:01.512 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:38:01.627 -0700 End of parsing custom threat
2023-07-07 06:38:01.907 -0700 pan_tdb_unserialize: mlav size 105456
2023-07-07 06:38:01.908 -0700 [mlav] unserialize start
2023-07-07 06:38:01.908 -0700 [mlav] unserialized timeouts benign: 300 malware: 200 unknown: 60
2023-07-07 06:38:01.908 -0700 [mlav] unserialize 10 control vectors
2023-07-07 06:38:01.909 -0700 [mlav] unserialize got 950 ngrams
2023-07-07 06:38:01.909 -0700 [mlav] unserialize got 36 scales
2023-07-07 06:38:01.909 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.909 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.909 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.910 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.910 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.910 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.910 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.910 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.910 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.911 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.911 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.911 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.911 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.911 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.911 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[65793.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[65793.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.912 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.913 -0700 [mlav] unserialize got scale[65793.000000]
2023-07-07 06:38:01.913 -0700 [mlav] unserialize got scale[65793.000000]
2023-07-07 06:38:01.913 -0700 [mlav] unserialize got scale[16777216.000000]
2023-07-07 06:38:01.913 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.913 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.913 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.914 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.914 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.914 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.914 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.914 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.914 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.914 -0700 [mlav] unserialize got 246 ngrams
2023-07-07 06:38:01.915 -0700 [mlav] unserialize got 0 scales
2023-07-07 06:38:01.915 -0700 [mlav] unserialize got 0 ngrams
2023-07-07 06:38:01.915 -0700 [mlav] unserialize got 9 scales
2023-07-07 06:38:01.915 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.915 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.915 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.915 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.916 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.916 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.916 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.916 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.916 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.916 -0700 [mlav] unserialize got 1000 ngrams
2023-07-07 06:38:01.916 -0700 [mlav] unserialize got 0 scales
2023-07-07 06:38:01.916 -0700 [mlav] unserialize got 997 ngrams
2023-07-07 06:38:01.961 -0700 [mlav] unserialize got 0 scales
2023-07-07 06:38:01.961 -0700 [mlav] unserialize got 950 ngrams
2023-07-07 06:38:01.961 -0700 [mlav] unserialize got 0 scales
2023-07-07 06:38:01.961 -0700 [mlav] unserialize got 977 ngrams
2023-07-07 06:38:01.961 -0700 [mlav] unserialize got 0 scales
2023-07-07 06:38:01.961 -0700 [mlav] unserialize got 0 ngrams
2023-07-07 06:38:01.962 -0700 [mlav] unserialize got 9 scales
2023-07-07 06:38:01.962 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.962 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.962 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.962 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.962 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.962 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.962 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.963 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.963 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.963 -0700 [mlav] unserialize got 0 ngrams
2023-07-07 06:38:01.963 -0700 [mlav] unserialize got 9 scales
2023-07-07 06:38:01.963 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.963 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.963 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.964 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.964 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.964 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.964 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.964 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.964 -0700 [mlav] unserialize got scale[1.000000]
2023-07-07 06:38:01.964 -0700 [mlav] unserialize got 1885 ngrams
2023-07-07 06:38:01.964 -0700 [mlav] unserialize got 0 scales
2023-07-07 06:38:01.965 -0700 [mlav] unserialize got 4162 bytes tld hash
2023-07-07 06:38:01.965 -0700 [mlav] unserialize got 1228 bytes cctld hash
2023-07-07 06:38:01.965 -0700 [mlav] unserialize end
2023-07-07 06:38:01.965 -0700 pan_tdb_unserialize: pfr size 6820
2023-07-07 06:38:01.965 -0700 [pfr] unserialize start
2023-07-07 06:38:01.965 -0700 [pfr] unserialize end with size 6820
2023-07-07 06:38:02.512 -0700 tdb compile flag is still up, abort thread wait 1 second
2023-07-07 06:38:03.123 -0700 [Cache] Load /opt/pancfg/mgmt/content//cache/b0000//tdb.cache.ser-0-hs0-0 success
load cache is successful
2023-07-07 06:38:03.126 -0700 [cloudapp]:update s_tdb to s_cloudapp_tdb
2023-07-07 06:38:03.132 -0700 cloudapp: got strmap 0x55bc98bd7200
2023-07-07 06:38:03.132 -0700 copy 5537 fields from s_tdb to s_cloudapp_tdb
2023-07-07 06:38:03.152 -0700 [cloudapp] update sml for appid[2]
2023-07-07 06:38:03.156 -0700 [cloudapp] update sml for appid[15]
2023-07-07 06:38:03.157 -0700 Warning: pan_ctrl_compile_tdb(pan_config_handler_sysd.c:603): config commit aborted
2023-07-07 06:38:03.157 -0700 TDB compilation done, return 0
2023-07-07 06:38:03.165 -0700 Warning: pan_ctrl_save_config(pan_config_handler_sysd.c:2201): config commit aborted
2023-07-07 06:38:03.166 -0700 Error: pan_ctrl_compile_cfg(pan_config_handler_sysd.c:2476): pan_ctrl_save_config() failed
2023-07-07 06:38:03.166 -0700 Error: pan_config_handler_sysd(pan_config_handler_sysd.c:2890): pan_ctrl_compile_cfg() failed
2023-07-07 06:38:03.166 -0700 Error: pan_ctrl_parse_config(pan_controller_proc.c:755): pan_config_handler_sysd() failed
2023-07-07 06:38:03.166 -0700 start to destruct s_tdb
2023-07-07 06:38:03.217 -0700 start to destruct mlav_info
2023-07-07 06:38:03.217 -0700 start to destruct mlav_info 0x55bcbba2a690
2023-07-07 06:38:03.260 -0700 Error: pan_ctrl_config_phase1(pan_controller_proc.c:1348): pan_ctrl_parse_config() failed
2023-07-07 06:38:03.261 -0700 Config commit phase1 failed
2023-07-07 06:38:03.261 -0700 Deleted alt data in redis
2023-07-07 06:38:03.262 -0700 No need to sync base ids in cfg
2023-07-07 06:38:03.263 -0700 Error: cfgagent_modify_callback(pan_cfgagent.c:98): Modify string (sw.mgmt.runtime.clients.device.err) error: USER (1)
2023-07-07 06:38:03.512 -0700 kill SIGUSR1 to pid 0
2023-07-07 06:38:03.512 -0700 Sending phase_abort to DP
2023-07-07 06:38:03.512 -0700 Phase_abort to DP done, Setting ctrl state to IDLE
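Scanning a long devsrv.log capture like the one above by eye is tedious; filtering for Error/Warning/failed/abort keywords surfaces the lines that explain the commit death. A rough sketch, assuming the excerpt has been saved as text (`flag_failures` is a made-up helper, not a PAN-OS command):

```python
import re

def flag_failures(log_text):
    """Return only the lines that suggest why the commit failed."""
    pattern = re.compile(r"\b(Error:|Warning:|failed|abort)")
    return [line for line in log_text.splitlines() if pattern.search(line)]

# A few lines copied from the devsrv.log excerpt above.
sample = """\
2023-07-07 06:38:03.157 -0700 Warning: pan_ctrl_compile_tdb(pan_config_handler_sysd.c:603): config commit aborted
2023-07-07 06:38:03.166 -0700 Error: pan_ctrl_compile_cfg(pan_config_handler_sysd.c:2476): pan_ctrl_save_config() failed
2023-07-07 06:38:03.261 -0700 Config commit phase1 failed
2023-07-07 06:38:03.262 -0700 No need to sync base ids in cfg
"""

for line in flag_failures(sample):
    print(line)
```

On the full excerpt this reduces hundreds of mlav/TDB progress lines to the handful of Error/Warning entries around pan_ctrl_parse_config, which is where phase 1 actually fails.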

 

  • 1 accepted solution
  • 1624 Views
  • 3 replies
  • 0 Likes