05-08-2022 04:49 AM
Hi,
I have a seemingly simple task that I can't figure out how to handle. I want to import a CSV file into context, with the column names as subkeys, and then get rid of some excess subkeys/columns.
My steps are:
1. download a file from a file share using smb-download - SUCCESS
2. import the file into context keys using the ParseCSV automation - SUCCESS
3. set a new key that stores only the selected subkeys - FAIL
For the last step, I'm using:
!SetMultipleValues parent=testKey keys="Name,Surname" values="${ParseCSV.ParsedCSV.Name},${ParseCSV.ParsedCSV.Surname}"
It seems that SetMultipleValues does not recognize that there is more than one value under the same subkey (i.e. ParseCSV.ParsedCSV.[0].Name, ParseCSV.ParsedCSV.[1].Name, ParseCSV.ParsedCSV.[2].Name, etc.) and does not run on all of them. What am I missing? Is there another easy way to take selected subkeys and store them under a separate key?
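In plain Python terms, this is the reshaping I'm after (the field names here are just examples):

```python
# Example only: ParseCSV leaves a list of dicts in context, one per CSV row.
parsed_csv = [
    {"Name": "Alice", "Surname": "Smith", "Email": "a@example.com"},
    {"Name": "Bob", "Surname": "Jones", "Email": "b@example.com"},
]

# Keep only the selected subkeys from every row, not just the first one.
keep = {"Name", "Surname"}
test_key = [{k: v for k, v in row.items() if k in keep} for row in parsed_csv]

print(test_key)
# [{'Name': 'Alice', 'Surname': 'Smith'}, {'Name': 'Bob', 'Surname': 'Jones'}]
```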
Using XSOAR 6.6 for this.
Thank you,
Antanas
05-09-2022 09:50 PM
Hi Antanas,
You will need to create a custom automation for this. The automation has to loop over the array of dicts and drop the unwanted columns. I'm attaching some code I wrote that does something similar: it processes a CSV file and saves only specific columns.
## Script to strip excess columns and keep only the columns listed in the columnsToKeep argument
# columnsToKeep - comma-separated list of column names; if no columns match, a blank file is created

# Arguments to the script
fileID = demisto.args()['fileEntryID']
fileName = demisto.args()['FileName']
# Split the comma-separated argument into a list so the membership test
# below matches whole column names rather than substrings
columnsToKeep = argToList(demisto.args()['columnsToKeep'])

# Output array
newData = []

# Convert the file input into an array of dicts, one per CSV row
data = demisto.executeCommand('ParseCSV', {'entryID': fileID})[0]['Contents']

# Loop through the rows and copy the matched columns
for item in data:
    temp = {}
    for key in item:
        if key in columnsToKeep:
            temp[key] = item[key]
    newData.append(temp)

# Write the filtered data to a new CSV file
demisto.results(demisto.executeCommand('ExportToCSV', {'fileName': 'stripped_' + str(fileName) + '.csv', 'csvArray': newData}))
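One subtlety worth noting: if columnsToKeep arrives as a single comma-separated string, Python's `key in columnsToKeep` is a substring test and can match unintended columns. Splitting the argument into a list first makes the match exact (standalone sketch, sample names made up):

```python
columns_arg = "ID,Name"  # argument delivered as a comma-separated string
row = {"Name": "Alice", "D": "x", "ID": "7"}

# Substring test against the raw string: "D" in "ID,Name" is True,
# so the unwanted "D" column slips through.
substring_keep = {k: v for k, v in row.items() if k in columns_arg}

# Splitting first makes the membership test exact.
wanted = [c.strip() for c in columns_arg.split(",")]
exact_keep = {k: v for k, v in row.items() if k in wanted}

print(substring_keep)  # {'Name': 'Alice', 'D': 'x', 'ID': '7'}
print(exact_keep)      # {'Name': 'Alice', 'ID': '7'}
```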
05-09-2022 10:57 PM
Thanks! I will definitely try it. I also finally found a workaround: I import the whole thing with ParseCSV, then use the GetFields transformer in SetAndHandleEmpty to save the selected keys, and then delete the initial key. Of course, your suggested solution is more optimal.
05-09-2022 11:16 PM
One more thing to keep in mind is the size of the CSV file. Adding a lot of data to context can affect incident load times and create large indexes. The script solution takes a file as input and outputs another file; no indexing is performed on a file's contents.