Issues with CSV reports in Prisma Cloud Compute


Hi all,

I'm facing an annoying issue with CSV reports from Prisma Cloud Compute, Compliance Explorer.

I download the CSV report and open it with Excel: when I try to convert it with "Text to Columns" I always find that some rows have already been split into two or more columns, and I'm forced to edit them beforehand.

Does anybody have a quick solution to overcome this issue?

4 REPLIES


Hi PMaggioni,

 

One simple check you can run before importing your CSV file is to make sure there aren't any extra commas on each line. For example, this command will output each line number with the corresponding count of commas:

awk -F ',' '{print NR ":" NF-1}' your-file.csv
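
If you want to narrow that down, a small sketch that builds on the same command (it assumes the first line is the header row and that no legitimate value contains a quoted comma) prints only the lines whose field count differs from the header's:

awk -F ',' 'NR==1 {expected = NF} NF != expected {print NR ": " NF " fields (expected " expected ")"}' your-file.csv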

Hope this helps.

Julian

Helping protect our customers' digital way of life.

Hi Julian,

Unfortunately it doesn't help; I need something able to skip the problem, not just detect it (to detect the issue it's enough to open the CSV with Excel).

I've just discovered that LibreOffice is much better than Excel at importing the CSV; at least you get the right columns and only the extra ones to delete.
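
If that import ever needs to be repeated, LibreOffice can also run the conversion from the command line; a minimal sketch, assuming the default CSV import settings are acceptable and report.csv is a placeholder for the exported file:

soffice --headless --convert-to xlsx report.csv

The resulting report.xlsx then opens in Excel with the columns as LibreOffice parsed them.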

I understand that the Prisma export seems to be defective, embedding extra delimiters within the values of the CSV file.

Here's how I would fix the problem in less time than it takes to manage this discussion.

0) Work with a SMALL BATCH of records (i.e. microservices, smallish user stories); you will work more efficiently when using a small batch of CSV records to locate and fix the root cause.

1) @JNeytchev offered good advice for locating your root causes. Now that you have a stream of field counts, identify the exceptions and perhaps count them to determine whether you will pivot or persist down this path.

2) You need an editor (e.g. vi, vim) that can perform "stream edits" over sections or the entire file.

3) Manually search the file, perhaps developing a regular expression that detects only the defective fields.

4) This is another pivot point: is the problem set small enough that it's faster to fix manually, or do you need a "batch search and replace" command?

5) Enhance that search regex into a "search and replace" command (see the sketch after this list).

6) Then test and refine it on the small batch; then apply it to the entire file.
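
A minimal sketch of steps 5 and 6 using sed (the pattern and replacement are placeholders you would develop in step 3, and your-file.csv stands in for the exported report):

# try the replacement on a small batch first
head -n 50 your-file.csv > sample.csv
sed -E 's/pattern/replacement/g' sample.csv
# once the output looks right, apply it to the whole file
sed -E 's/pattern/replacement/g' your-file.csv > fixed.csv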

Good luck!

Tommy Hunt AWS-CSA, Java-CEA, PMP, SAFe Program Consultant
thunt@citrusoft.org
https://www.citrusoft.org

Thanks for the answer, but it's not very useful.

0) Work with a SMALL BATCH of records (i.e. microservices, smallish user stories)... -> In the Prisma Cloud compliance report, even if you select a small set (for example, all the images running as root), you get a larger report with all the issues for the involved images.

1) @JNeytchev offered good advice for locating your root causes. -> It's enough to open the .csv with Excel to find the rows with issues.

2) You need an editor (e.g. vi, vim) that can perform "stream edits"... -> Too time consuming. Anyway, LibreOffice seems to be the best solution.

3) Manually search the file, perhaps developing a regular expression that can detect only the defective fields. -> Already tried; the situations are too different (same for the other suggestions; at this point LibreOffice and a lot of time seems the best approach).
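
For completeness, here is one field-count-based heuristic, purely as a sketch: it only works if the extra commas always land in the last column, it assumes plain comma-delimited output with no quotes inside the affected values, and your-file.csv is a placeholder name, so it certainly won't cover every situation above. It re-joins any overflow fields into the last column and quotes them so Excel keeps them together:

awk -F ',' 'NR==1 {n = NF; print; next}
NF <= n {print; next}
{
  out = ""
  for (i = 1; i < n; i++) out = out $i ","           # keep the first n-1 fields as-is
  last = $n
  for (i = n + 1; i <= NF; i++) last = last "," $i   # re-join the overflow into the last field
  print out "\"" last "\""                           # quote it so Excel keeps it as one column
}' your-file.csv > fixed.csv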

 
