Google Dataflow job for log compression


L1 Bithead

Hello,


I understand that Prisma Cloud creates and manages network, subnet, and firewall rules within the respective VPC to perform log compression. However, I'm unclear about the specific architecture and process behind the job creation and execution. It seems that the process involves associating public IP addresses and opening all ports, which is not recommended from a security perspective.

 

Is there any detailed documentation or practical guidance available beyond the existing documentation on log compression mentioned below?

https://docs.prismacloud.io/en/enterprise-edition/content-collections/connect/connect-cloud-accounts... 



L2 Linker

Hello, 

Prisma Cloud's log compression for GCP leverages Google Cloud Dataflow. The process begins by enabling the Dataflow API and granting the necessary permissions to the Prisma Cloud service account. These include roles for running and examining Dataflow jobs, attaching service accounts to resources, and creating network infrastructure (network, subnetwork, and firewall rules) within your VPC.

The firewall rules are created automatically by Prisma Cloud to facilitate communication between the Dataflow pipeline and the compute instances that handle compression. Those compute instances are short-lived and are created within your VPC, not externally, so no public IP addresses are directly involved in the compression process. The Dataflow jobs run within your VPC, and Prisma Cloud does not open all ports; it creates only the firewall rules the Dataflow pipeline needs to function.

The location of your Cloud Storage bucket determines the Dataflow job's region. Prisma Cloud performs several tests before initiating compression to validate configuration and credentials; if the test job fails, it can be safely ignored. The compressed logs are saved back to your designated Cloud Storage bucket.
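If you want to confirm this in your own project, you can inspect the jobs and the firewall rules directly. The commands below are only a sketch; the region and network name are placeholders you will need to adjust:

    # List active Dataflow jobs in the region of your log bucket (region is a placeholder)
    gcloud dataflow jobs list --region=us-central1 --status=active

    # Review the firewall rules created in the VPC used for compression
    # (replace my-vpc with the network Prisma Cloud is using)
    gcloud compute firewall-rules list --filter="network:my-vpc"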

 

There is no additional documentation on GCP flow log compression at this time, but we can expand on this via our knowledge base.

Hi,

In my scenario, public IPs have been assigned automatically, all ports are showing as open, subnet flow logs are off, and Private Google Access is also off on the Dataflow worker nodes in my environment.
What should I do in this scenario?
Also, please share any knowledge base documentation links if you have them!

Hello, 

To remediate the security risks associated with your Google Cloud Platform (GCP) Dataflow environment, follow these steps:

1. Secure Dataflow Worker Nodes:

  • Change IP Configuration: Ensure your Dataflow jobs are configured to use private IP addresses for worker nodes. This reduces your attack surface by preventing direct public internet access. Refer to the GCP documentation for configuring private IP addresses for Dataflow jobs.
  • Restrict Port Access: Minimize open ports on your worker nodes. Allow only the ports needed for communication within your VPC network, and use GCP firewall rules to restrict inbound and outbound traffic. Example commands for both points follow this list.
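As a rough illustration of both points, the commands below launch a Dataflow job with internal IPs only and restrict worker-to-worker traffic to the ports Dataflow needs. The job name, template, bucket, subnet, network, and region values are all placeholders, and since Prisma Cloud creates its own compression jobs, treat this as an example of the relevant flags rather than the exact job Prisma Cloud runs:

    # Run a Dataflow job with no external worker IPs, pinned to a private subnet
    gcloud dataflow jobs run my-compression-job \
        --gcs-location=gs://dataflow-templates/latest/Bulk_Compress_GCS_Files \
        --region=us-central1 \
        --subnetwork=regions/us-central1/subnetworks/my-private-subnet \
        --disable-public-ips \
        --parameters=inputFilePattern=gs://my-log-bucket/*.json,outputDirectory=gs://my-log-bucket/compressed,outputFailureFile=gs://my-log-bucket/failures.csv,compression=GZIP

    # Allow only the ports Dataflow workers use to talk to each other (TCP 12345-12346),
    # scoped to instances with the "dataflow" network tag, instead of an allow-all rule
    gcloud compute firewall-rules create allow-dataflow-internal \
        --network=my-vpc \
        --direction=INGRESS \
        --action=ALLOW \
        --rules=tcp:12345-12346 \
        --source-tags=dataflow \
        --target-tags=dataflow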

2. Enable Subnet Flow Logs:

  • Enable Logging: Enable flow logs for the subnets containing your Dataflow worker nodes. This provides valuable information about network traffic, aiding security monitoring and incident response. You can enable flow logs through the GCP console or with the gcloud command-line tool (a command example follows these steps). The console steps are:
    1. Log in to the GCP console.
    2. Navigate to VPC Network.
    3. Select the relevant VPC network and subnet.
    4. Edit the subnet settings.
    5. Set "Flow Logs" to "On".
    6. Save the changes.
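The equivalent gcloud command looks roughly like this (subnet, region, and sampling values are placeholders to adjust for your environment):

    # Enable VPC flow logs on the subnet used by the Dataflow workers
    gcloud compute networks subnets update my-private-subnet \
        --region=us-central1 \
        --enable-flow-logs \
        --logging-aggregation-interval=interval-5-sec \
        --logging-flow-sampling=0.5 \
        --logging-metadata=include-all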

3. Enable Private Google Access:

  • Enable Private Access: Enable Private Google Access for your VPC network. This allows your instances to access Google services without using public IP addresses, enhancing security and reducing latency. An example command is shown below.
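A minimal sketch with gcloud, assuming the same placeholder subnet and region as above:

    # Enable Private Google Access so workers reach Cloud Storage and other
    # Google APIs over internal routes instead of external IPs
    gcloud compute networks subnets update my-private-subnet \
        --region=us-central1 \
        --enable-private-ip-google-access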

4. Regularly Review and Update Security:

  • Continuous Monitoring: Regularly review your security configurations to ensure they remain effective, and implement a process for detecting and responding to security alerts. A simple audit example is shown below.
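One lightweight way to do this is a periodic audit of subnet settings and firewall rules, for example:

    # Check which subnets still have flow logs or Private Google Access disabled
    gcloud compute networks subnets list \
        --format="table(name,region.basename(),enableFlowLogs,privateIpGoogleAccess)"

    # Review all firewall rules in the project for overly broad allow rules
    gcloud compute firewall-rules list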

Implementing these changes will not disrupt the Prisma Cloud compression jobs running on the Dataflow worker nodes, correct?
