PAN-OS SD-WAN commit error "Failed to merge autogenerated plugin config withtemplate config"

L1 Bithead

I configured the basic config elements and am stuck on the following commit error: "Failed to merge autogenerated plugin config withtemplate config".

I followed the KB article below and checked configd.log, but that didn't help.

https://knowledgebase.paloaltonetworks.com/KCSArticleDetail?id=kA14u0000004NkpCAE&lang=en_US%E2%80%A...

 

Below is the full error output from configd.log. Please help me understand the error message.

=============================================================================================================================

2023-06-23 23:20:06.805 +0530 Error: pan_get_admin_user_stat(pan_auth_admin_login_stat.c:225): Admin user "_cliuser" auth statistics file "/opt/pancfg/home/_cliuser/login_statistics.txt" doesn't exist
2023-06-23 23:20:06.805 +0530 Error: _get_admin_login_stat(pan_cfg_auth_handler.c:493): Admin user "_cliuser" has not logged in yet
Connection to Update server: updates.paloaltonetworks.com completed successfully, initiated by 192.168.29.234
2023-06-23 23:20:11.448 +0530 client dagger reported op command FAILED
2023-06-23 23:20:11.644 +0530 client dagger reported op command FAILED
2023-06-23 23:20:11.835 +0530 client dagger reported op command FAILED
2023-06-23 23:20:12.031 +0530 client dagger reported op command FAILED
2023-06-23 23:20:12.224 +0530 client dagger reported op command FAILED
2023-06-23 23:20:12.421 +0530 client dagger reported op command FAILED
2023-06-23 23:20:12.619 +0530 client dagger reported op command FAILED
2023-06-23 23:20:12.811 +0530 client dagger reported op command FAILED
2023-06-23 23:20:13.001 +0530 client dagger reported op command FAILED
2023-06-23 23:20:13.191 +0530 client dagger reported op command FAILED
2023-06-23 23:20:13.381 +0530 rlog fwd: ls conn: count: 0, status count: 0
2023-06-23 23:20:13.964 +0530 client useridd reported op command FAILED
2023-06-23 23:20:14.156 +0530 client useridd reported op command FAILED
2023-06-23 23:20:14.347 +0530 client useridd reported op command FAILED
2023-06-23 23:20:14.538 +0530 client useridd reported op command FAILED
2023-06-23 23:20:14.727 +0530 client distributord reported op command FAILED
2023-06-23 23:20:14.921 +0530 client dagger reported op command FAILED
2023-06-23 23:20:15.824 +0530 client dagger reported op command FAILED
2023-06-23 23:20:16.239 +0530 client dagger reported op command FAILED
2023-06-23 23:20:16.659 +0530 client dagger reported op command FAILED
2023-06-23 23:20:16.850 +0530 client dagger reported op command FAILED
2023-06-23 23:20:17.039 +0530 client dagger reported op command FAILED
2023-06-23 23:20:17.228 +0530 client dagger reported op command FAILED
2023-06-23 23:20:17.421 +0530 client useridd reported op command FAILED
2023-06-23 23:20:17.611 +0530 client routed reported op command FAILED
2023-06-23 23:20:18.148 +0530 client useridd reported op command FAILED
2023-06-23 23:20:18.482 +0530 client dagger reported op command FAILED
2023-06-23 23:20:18.671 +0530 client dagger reported op command FAILED
2023-06-23 23:20:19.007 +0530 client rasmgr reported op command FAILED
2023-06-23 23:20:19.200 +0530 client dagger reported op command FAILED
2023-06-23 23:20:19.391 +0530 client rasmgr reported op command FAILED
2023-06-23 23:20:19.472 +0530 rlog fwd: ls conn: count: 0, status count: 0
2023-06-23 23:20:19.581 +0530 client rasmgr reported op command FAILED
2023-06-23 23:20:19.778 +0530 client rasmgr reported op command FAILED
2023-06-23 23:20:19.968 +0530 client iotd reported op command FAILED
2023-06-23 23:20:20.165 +0530 client dagger reported op command FAILED
2023-06-23 23:20:20.358 +0530 client dagger reported op command FAILED
2023-06-23 23:20:20.546 +0530 client dagger reported op command FAILED
2023-06-23 23:20:20.955 +0530 client dagger reported op command FAILED
2023-06-23 23:20:21.158 +0530 client dagger reported op command FAILED
2023-06-23 23:20:21.603 +0530 Need to fetch CAS regions since last_fetch 1687537825 + max_age 3600 < now 1687542621
2023-06-23 23:20:21.916 +0530 Error: pan_rlog_fwd_show_logging_status(pan_rlog_fwd_debug.c:2335): log fwd status: failed to fetch obj connection-status
2023-06-23 23:20:21.918 +0530 Pref file: /opt/pancfg/mgmt/global/lcs-pref.xml doesnt exist
2023-06-23 23:20:21.918 +0530 Pref file: /opt/pancfg/mgmt/global/lcs-pref.xml doesnt exist
2023-06-23 23:20:21.918 +0530 rlog fwd: ls conn: count: 0, status count: 0
2023-06-23 23:20:22.105 +0530 Error: pan_lcaas_status_handler(pan_ops_common_lcaas.c:1009): lcaas status: fetch sdb object cfg.lcaas-license failed.
2023-06-23 23:20:22.642 +0530 response code: 200
2023-06-23 23:20:22.643 +0530 Got max_age for CAS regions info as 3600 seconds
2023-06-23 23:20:23.204 +0530 client dagger reported op command FAILED
2023-06-23 23:20:23.395 +0530 client dagger reported op command FAILED
2023-06-23 23:20:23.586 +0530 client routed reported op command FAILED
2023-06-23 23:20:23.994 +0530 Error: mgmt_sysd_device_logstats_callback(pan_mgmt_sysd.c:1776): failed to get device status runtime.logging-status-list
2023-06-23 23:20:27.650 +0530 client dagger reported op command FAILED
2023-06-23 23:20:27.840 +0530 client useridd reported op command FAILED
2023-06-23 23:20:28.043 +0530 client dagger reported op command FAILED
2023-06-23 23:20:42.373 +0530 Error: pan_cfg_get_session_by_cookie(pan_cfg_mgr.c:14562): session with cookie 5766658191354486 doesn't exist
2023-06-23 23:20:42.373 +0530 Error: pan_cfg_session_is_alive(pan_cfg_mgr.c:38102): cannot get socket peer address info
2023-06-23 23:20:42.373 +0530 Error: pan_cfg_session_is_alive(pan_cfg_mgr.c:38126): username missing in cms request
2023-06-23 23:20:48.931 +0530 Encrypting panorama push / diff-all config
2023-06-23 23:20:49.353 +0530 CommitAll job started processing. Dequeue time=2023/06/23 23:20:49. JobId=6807.User: Panorama-ram
2023-06-23 23:20:49.353 +0530 start pan_commit_get_cfg_root
2023-06-23 23:20:50.156 +0530 Return detail-ver 10.1.8
2023-06-23 23:20:51.241 +0530 Panorama push template TS-Spoke-440 with merge-with-candidate-cfg flags set.JobId=6807.User=Panorama-ram. Dequeue time=2023/06/23 23:20:49. TPL version: 247.
2023-06-23 23:20:51.259 +0530 start pan_cfg_save_commit_candidate
2023-06-23 23:20:51.259 +0530 cfg-version/detail-version in template push request 11.0.0/11.0.1, different from my version/detail-version 10.1.0/10.1.8 - will need to apply some transforms
2023-06-23 23:20:51.503 +0530 files name tpl-transform-11.0.1-to-11.0.0.xsl
2023-06-23 23:20:51.503 +0530 start_major 11, start sub-major 0, start minor 1, end_major 11, end sub-major 0, end minor 0
2023-06-23 23:20:51.503 +0530 migration file found is tpl-transform-11.0.1-to-11.0.0.xsl
2023-06-23 23:20:51.503 +0530 pan_cfg_apply_ver_transforms: Skipping zero size migration file tpl-transform-11.0.1-to-11.0.0.xsl
2023-06-23 23:20:51.503 +0530 Reached last minor version transform 11.0.0
2023-06-23 23:20:51.503 +0530 pan_cfg_apply_ver_transforms: Applying xform /opt/pancfg/mgmt/transforms/tpl-transform-11.0.0-to-10.2.0.xsl to pushed template
2023-06-23 23:20:51.519 +0530 pan_cfg_apply_ver_transforms: Set detail-version to 10.1.8
2023-06-23 23:20:51.520 +0530 files name tpl-transform-10.1.8-to-10.1.7.xsl
2023-06-23 23:20:51.520 +0530 files name tpl-transform-10.1.7-to-10.1.6.xsl
2023-06-23 23:20:51.520 +0530 files name tpl-transform-10.1.6-to-10.1.5.xsl
2023-06-23 23:20:51.520 +0530 files name tpl-transform-10.1.5-to-10.1.4.xsl
2023-06-23 23:20:51.520 +0530 files name tpl-transform-10.1.4-to-10.1.3.xsl
2023-06-23 23:20:51.521 +0530 files name tpl-transform-10.1.3-to-10.1.2.xsl
2023-06-23 23:20:51.521 +0530 files name tpl-transform-10.1.2-to-10.1.1.xsl
2023-06-23 23:20:51.521 +0530 files name tpl-transform-10.1.1-to-10.1.0.xsl
2023-06-23 23:20:51.521 +0530 start_major 10, start sub-major 1, start minor 8, end_major 10, end sub-major 1, end minor 7
2023-06-23 23:20:51.521 +0530 start_major 10, start sub-major 1, start minor 7, end_major 10, end sub-major 1, end minor 6
2023-06-23 23:20:51.521 +0530 start_major 10, start sub-major 1, start minor 6, end_major 10, end sub-major 1, end minor 5
2023-06-23 23:20:51.521 +0530 start_major 10, start sub-major 1, start minor 5, end_major 10, end sub-major 1, end minor 4
2023-06-23 23:20:51.521 +0530 start_major 10, start sub-major 1, start minor 4, end_major 10, end sub-major 1, end minor 3
2023-06-23 23:20:51.522 +0530 start_major 10, start sub-major 1, start minor 3, end_major 10, end sub-major 1, end minor 2
2023-06-23 23:20:51.522 +0530 start_major 10, start sub-major 1, start minor 2, end_major 10, end sub-major 1, end minor 1
2023-06-23 23:20:51.522 +0530 start_major 10, start sub-major 1, start minor 1, end_major 10, end sub-major 1, end minor 0
2023-06-23 23:20:51.522 +0530 pan_cfg_apply_ver_transforms: Applying xform /opt/pancfg/mgmt/transforms/tpl-transform-10.2.0-to-10.1.0.xsl to pushed template
2023-06-23 23:20:51.536 +0530 pan_cfg_apply_ver_transforms: Set detail-version to 10.1.8
2023-06-23 23:20:51.539 +0530 Error: pan_cfg_transform_fullpath(pan_cfg_utils.c:6829): error generating transform /opt/pancfg/mgmt/factory/tplrenamemapfrompushreq.xsl
2023-06-23 23:20:51.539 +0530 Error: pan_cfg_tpl_renamemap_from_request(pan_cfg_templates.c:4986): failed to generate tpl rename map from request
File "/opt/pancfg/mgmt/transforms/cluster-gen.py", line 2592, in <module>
if xml_to_file(gen_auto_config(root, tpl_config), xml_outfile, True) < 0:
File "/opt/pancfg/mgmt/transforms/cluster-gen.py", line 2167, in gen_auto_config
routing_profile_node = RoutingProfile(root, local_info)
File "/opt/pancfg/mgmt/transforms/cluster-gen.py", line 1199, in __init__
xml_add_children(self.routing_profile_node,self.__get_bgp_node(root, local_info, peer_info),self.__get_filters_node(root, local_info, peer_info))
File "/opt/pancfg/mgmt/transforms/cluster-gen.py", line 942, in __get_bgp_node
if local_info.bgp_info.remove_private_as == "yes":
2023-06-23 23:20:51.646 +0530 Error: pan_cfg_pushtpl_autogen_config_merge(pan_cfg_templates.c:6856): self.prisma_cluster:False
self.prisma_cluster:False
Failed to auto generate SD-wan config:
'NoneType' object has no attribute 'remove_private_as'
***Traceback***
Failed to execute cluster-gen.py
2023-06-23 23:20:51.646 +0530 Error: pan_cfg_tpl_generate_candidate(pan_cfg_templates.c:7075): Failed to merge autogenerated plugin config withtemplate config
2023-06-23 23:20:51.646 +0530 Error: pan_cfg_save_commit_candidate(pan_cfg_users.c:4078): error generating tpl candidate
2023-06-23 23:20:51.646 +0530 Error: pan_cfg_generate_commit_candidate(pan_cfg_users.c:4598): Unable to save commit candidate for device localhost.localdomain
2023-06-23 23:20:51.646 +0530 Error: pan_cfg_generate_commit_candidates(pan_cfg_users.c:4663): Unable to save commit candidate
2023-06-23 23:20:51.646 +0530 detail : Commit from Panorama. Merged with candidate config: Yes. Commit parameters: force=false, device_network=true, shared_object=true. Commit All Vsys.
'cfg.url-vendor-old': NO_MATCHES
/usr/local/bin/bin_scripts/old_url_vendor_is_pan.sh: line 2: [: !=: unary operator expected
2023-06-23 23:20:51.859 +0530 Could not find url vendor, returning paloaltonetworks as default
2023-06-23 23:20:51.935 +0530 Error: pan_cfg_get_cms_msg(pan_cfg_mgr.c:44615): SC3R: failed to get current CSR
2023-06-23 23:20:52.337 +0530 Error: pan_jobmgr_process_job(pan_job_mgr.c:3673): Failed to do pre commit processing
'cfg.url-vendor-old': NO_MATCHES
/usr/local/bin/bin_scripts/old_url_vendor_is_pan.sh: line 2: [: !=: unary operator expected
2023-06-23 23:20:52.746 +0530 Could not find url vendor, returning paloaltonetworks as default
2023-06-23 23:20:52.807 +0530 Error: pan_cfg_get_cms_msg(pan_cfg_mgr.c:44615): SC3R: failed to get current CSR

====================================================================================================
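
From the traceback, the push seems to die inside the SD-WAN plugin's auto-generation script (cluster-gen.py): in __get_bgp_node, local_info.bgp_info is apparently None when the script reads remove_private_as, which matches the 'NoneType' object has no attribute 'remove_private_as' line. Here is a minimal sketch of what I think is happening; the class layout is my own illustration and not the plugin's actual code, only the attribute names come from the traceback:

# Hypothetical illustration of the failure pattern, not the real cluster-gen.py code.
class BgpInfo:
    def __init__(self, remove_private_as="no"):
        self.remove_private_as = remove_private_as

class LocalInfo:
    def __init__(self, bgp_info=None):
        # Assumption: bgp_info stays None when no SD-WAN BGP routing
        # settings are found for this device in the pushed template.
        self.bgp_info = bgp_info

def get_bgp_node(local_info):
    # Dereferences bgp_info without a None check, as line 942 of
    # cluster-gen.py appears to do in the traceback.
    if local_info.bgp_info.remove_private_as == "yes":
        return "<remove-private-as/>"
    return ""

get_bgp_node(LocalInfo(BgpInfo()))  # BGP info present: works
get_bgp_node(LocalInfo())           # raises AttributeError: 'NoneType' object has no attribute 'remove_private_as'

If that reading is right, the problem would be missing or incomplete SD-WAN BGP routing configuration for this device (for example the router ID, AS number, or VPN cluster membership) rather than anything in the script itself, but I would appreciate confirmation.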

 

The predefined zone is configured and pushed to the device. Additional zones exist as well, but I decided not to use them for the planned SD-WAN interfaces.

 

1 REPLY

L0 Member

I am also running into the same issue. Did you manage to fix it?

 

I'll let you know if TAC or my SE is able to fix it.

 

 
