Google dataflow job for log compression



L1 Bithead

Hello,


I understand that Prisma Cloud creates and manages network, subnet, and firewall rules within the respective VPC to perform log compression. However, I'm unclear about the specific architecture and process behind the job creation and execution. It seems that the process involves associating public IP addresses and opening all ports, which is not recommended from a security perspective.


Is there any detailed documentation or practical guidance available beyond the existing documentation on log compression mentioned below?

https://docs.prismacloud.io/en/enterprise-edition/content-collections/connect/connect-cloud-accounts... 



L2 Linker

Hello, 

Prisma Cloud's log compression for GCP leverages Google Cloud Dataflow. The process begins by enabling the Dataflow API and granting the necessary permissions to the Prisma Cloud service account. This includes roles for running and examining Dataflow jobs, attaching service accounts to resources, and creating network infrastructure (network, subnetwork, and firewall rules) within your VPC.

These firewall rules are created automatically by Prisma Cloud to facilitate communication between the Dataflow pipeline and the compute instances handling compression. The compute instances are short-lived and created within your VPC, not externally, so no public IP addresses are directly involved in the compression process. The Dataflow jobs run within your VPC, and Prisma Cloud does not open all ports; it creates only the firewall rules the Dataflow pipeline needs to function.

The location of your Cloud Storage bucket determines the Dataflow job's region. Prisma Cloud performs several tests before initiating compression to validate configuration and credentials; if the test job fails, it can be safely ignored. The compressed logs are saved back to your designated Cloud Storage bucket.
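The API enablement and role grants described above can be sketched with gcloud. This is a minimal illustration, not the authoritative setup: the project ID and service-account email are placeholders, and the role list (`roles/dataflow.admin`, `roles/iam.serviceAccountUser`, `roles/compute.networkAdmin`) is an assumption about which standard GCP roles cover the capabilities mentioned; the Prisma Cloud onboarding flow is the source of truth for the exact set.

```shell
# Placeholders -- substitute your own project and the service account
# created during Prisma Cloud onboarding.
PROJECT_ID="my-gcp-project"
SA_EMAIL="prisma-cloud@${PROJECT_ID}.iam.gserviceaccount.com"

# Enable the Dataflow API for the project.
gcloud services enable dataflow.googleapis.com --project "${PROJECT_ID}"

# Illustrative roles covering: running/examining Dataflow jobs,
# attaching service accounts to resources, and creating network
# infrastructure (network, subnetwork, firewall rules).
for ROLE in roles/dataflow.admin \
            roles/iam.serviceAccountUser \
            roles/compute.networkAdmin; do
  gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
    --member "serviceAccount:${SA_EMAIL}" \
    --role "${ROLE}"
done
```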


There is no additional documentation on GCP flow log compression at this time, but we can expand on this via our knowledge base.
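In the meantime, one practical way to confirm the scoping described above is to inspect the resources after a compression job runs. The `dataflow` name filter below is an assumption about how the auto-created rules and worker VMs are named in your project; adjust it to whatever prefix you actually see.

```shell
# List firewall rules matching the assumed naming, showing source
# ranges and allowed ports -- confirms nothing is open to 0.0.0.0/0
# on all ports.
gcloud compute firewall-rules list \
  --filter="name~dataflow" \
  --format="table(name, network, sourceRanges.list(), allowed)"

# Show the short-lived worker VMs' external IP column -- an empty
# natIP confirms no public IP address was attached.
gcloud compute instances list \
  --filter="name~dataflow" \
  --format="table(name, networkInterfaces[0].accessConfigs[0].natIP)"
```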
