Docker Container for Expedition


L1 Bithead

I created a Docker container for the Palo Alto Expedition tool as of version 1.1.38 and published it to Docker Hub. I rebuilt it on Alpine Linux and stripped the binaries, reducing the image size to a mere 1.43 GB. Optionally, you can make the database persistent by binding a directory on your host machine to /var/lib/mysql; likewise, you can bind a directory to /data to persist Expedition's data. I also tweaked the MySQL config to store the InnoDB temp files in /tmp inside the container, which allows the container to support database persistence on macOS (and presumably Windows) in addition to Linux. This was necessary because of an incompatibility between MariaDB and the ZFS backend used by the Alpine-based Docker host on macOS/Windows: https://jira.mariadb.org/browse/MDEV-16015
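For reference, a rough sketch of running it directly with docker run and both bind mounts (the host directory names, container name, and published host ports below are illustrative choices, not requirements):

docker run -d --name expedition \
  -p 80:80 -p 443:443 \
  -v "$(pwd)/data":/data \
  -v "$(pwd)/db":/var/lib/mysql \
  jlegarreta/expedition:latest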

 

Docker Hub Repo:

https://hub.docker.com/r/jlegarreta/expedition

 

GitHub Repo (Docker source):
https://github.com/jlegarreta/expedition

 

Among other things, these are some of the dashboard errors I fixed (both changes are sketched below):

- Remediated the "log_bin flag in MariaDB is set to off" issue by turning binary logging on in the MySQL config

- Remediated the DBSQL_LOG_BIN value issue by setting it to 0 in /home/userSpace/userDefinitions.php
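If you want to apply the same two fixes to an existing install by hand, a rough sketch from inside the container (the MariaDB config path and the service restart command are assumptions on my part; /home/userSpace/userDefinitions.php is the file mentioned above):

# Enable binary logging; the config file location in this image is an assumption
printf '[mysqld]\nlog_bin\n' >> /etc/mysql/my.cnf

# Locate the DBSQL_LOG_BIN setting and change its value to 0 in your editor
grep -n DBSQL_LOG_BIN /home/userSpace/userDefinitions.php
vi /home/userSpace/userDefinitions.php

# Restart MariaDB so both changes take effect (assumes SysV-style service scripts)
service mysql restart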

 

Assuming you have sufficient disk space, your dashboard should be all green out of the box 🙂

 

Enjoy!

20 Replies

L2 Linker

Current file...

version: '3'

services:
  expedition:
    image: jlegarreta/expedition:latest
    restart: always
    volumes:
      - ./data:/data
      - ./db:/var/lib/mysql
    ports:
      - 8006:80
      - 8106:443
      - 8107:4050
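For anyone copying this: save it as docker-compose.yml and bring the stack up with

docker compose up -d        # or 'docker-compose up -d' with the older standalone binary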

 

L1 Bithead

L2 Linker

Hi,

 

We applied the recommended command. After that the container starts correctly, but if we stop it, it will not start again...

 

# cat docker-compose.yml
version: '3'

services:
  expedition:
    image: jlegarreta/expedition:latest
    restart: always
    volumes:
      - ./data:/data
      - ./db:/var/lib/mysql
    ports:
      - 8006:80
      - 8106:443
      - 8107:4050
    cap_add:
      - SYSLOG

 

Starting all services for expedition...
* Starting rsyslogd...
* Starting enhanced syslogd rsyslogd
...done.
* Starting sshd...
* Starting OpenBSD Secure Shell server sshd
...done.
* Starting rabbitmq...
* Starting RabbitMQ Messaging Server rabbitmq-server
...done.
* Starting mysql...
* Starting MariaDB database server mysqld
...fail!

 

We are a little bit lost...

 

Regards,

 

HA

 

L1 Bithead

(screenshot attached: YeungHing_0-1652409874980.png)

 

I get this error when I upload a 15 MB config file.
I want to modify php.ini. Is that file inside the Docker container, and can I modify it there?
Please advise what I can do to fix this.

 

@YeungHing You will need to log in to the CLI of the VM. Please refer to the article below for how to modify the file:

 

https://live.paloaltonetworks.com/t5/expedition-discussions/how-to-upload-configuration-files-bigger...
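Roughly, the steps look like this (the container name, php.ini location, and which limits to raise are assumptions on my part; the linked article has the exact values Expedition expects):

docker exec -it expedition /bin/bash        # drop into the running container
grep -nE 'upload_max_filesize|post_max_size' /etc/php/*/apache2/php.ini
# raise both limits above the size of the config file being uploaded,
# then restart the web server inside the container
service apache2 restart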

 

 

L1 Bithead

Hello, I tried updating this Docker image with the recent Expedition hotfix 1.2.57, using apt-get install expedition-beta and then committing the image. I followed this guide: https://phoenixnap.com/kb/how-to-commit-changes-to-docker-image
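Concretely, what I ran was roughly the following (the container and image names here are placeholders):

docker exec -it expedition apt-get install expedition-beta
docker commit expedition expedition:1.2.57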

Now the newly created image won't run; it just exits right away.

I'm not a Docker expert, and I'm not sure where to start troubleshooting this.

Is there any chance you're planning to release a new Docker image for Expedition 1.2.57?

Many thanks
