04-04-2025 09:27 AM
We've been tracking the campaign that made the news recently involving scanning and dictionary attacks against GlobalProtect portals. We've put together a parser to pull and deduplicate IP addresses from our logs for those attempts. Is anyone doing something similar?
Palo Alto, would there be interest in ingesting these somehow to create a block list of some kind? I'm sure it's the same actors across multiple customers.
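For anyone curious, the core of the parser is nothing fancy. Here is a rough sketch, assuming a CSV export of the system log where a Description column carries the source IP after "From:" (the file path, column name, and regex are placeholders you would adjust for your own export):

# Pull source IPs out of an exported system log and deduplicate them.
$logFile = "C:\Temp\GP-system-log.csv"

Import-Csv -Path $logFile |
    ForEach-Object {
        if ($_.Description -match "From:\s*(\d{1,3}(?:\.\d{1,3}){3})") { $matches[1] }
    } |
    Sort-Object -Unique |
    Set-Content -Path "C:\Temp\GPAttackIPs.txt" -Encoding UTF8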
04-04-2025 11:39 AM
Hi @jstrubberg ,
Here is a detailed discussion of the issue -> https://live.paloaltonetworks.com/t5/globalprotect-discussions/use-auto-tagging-to-block-failed-glob....
Here is another post that uses automation to update an EDL -> https://live.paloaltonetworks.com/t5/general-topics/automatically-blocking-ip-s-after-a-certain-numb....
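For reference, the mechanism behind the first link is the PAN-OS User-ID XML API's IP-to-tag registration; a Dynamic Address Group that matches the tag can then be referenced in a deny rule. Here is a rough, untested sketch of that call (the hostname, API key, IP, and tag name are placeholders):

# Register an offending source IP against a tag so a Dynamic Address Group picks it up.
$fw     = "firewall.example.com"
$apiKey = "YOUR-API-KEY"
$badIp  = "203.0.113.10"
$tag    = "gp-brute-force"

$cmd = @"
<uid-message>
  <version>2.0</version>
  <type>update</type>
  <payload>
    <register>
      <entry ip="$badIp">
        <tag><member>$tag</member></tag>
      </entry>
    </register>
  </payload>
</uid-message>
"@

# POST the uid-message to the User-ID API endpoint as form data.
Invoke-RestMethod -Uri "https://$fw/api/" -Method Post -Body @{ type = "user-id"; key = $apiKey; cmd = $cmd }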
Thanks,
Tom
04-04-2025 11:57 AM
Thanks, Tom. The tagging option seems like it could land you in a false-positive situation if you aren't very careful with the timing of the tag. As for the brute-force blocking, in my situation I don't think it would trigger. Today, for instance, I've logged about 1,100 attempts from 614 distinct IP addresses.
What I do have is a very distinct error message in the system log that never happens to a registered user. I'm using that to parse out bad actor IPs. My next step is to feed that list back to the logs and see if they ever hit anything other than my GlobalProtect portal to confirm it's nothing but bad actor traffic, and then I will build an EDL with them.
I was just wondering whether there might be a larger movement afoot after seeing this week's news article come out about the subject.
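In case it's useful, here is a rough sketch of that "feed the list back to the logs" step: it just builds a traffic-log filter from the harvested IPs so I can check whether they ever touched anything besides the portal. The paths and portal address are placeholders, and a list as large as mine would need to be split into smaller chunks:

# Build a traffic-log filter from the harvested IPs to look for hits on anything
# other than the GlobalProtect portal. Placeholder paths and addresses; a large
# IP list should be split into chunks, since the filter gets very long.
$ips      = Get-Content "C:\Temp\GPAttackIPs.txt" | Where-Object { $_ -match '\S' }
$portalIp = "198.51.100.20"   # public address of the GP portal (placeholder)

$srcFilter = ($ips | ForEach-Object { "( addr.src in $_ )" }) -join " or "
$query     = "( $srcFilter ) and not ( addr.dst in $portalIp )"

# Paste $query into Monitor > Logs > Traffic, or use it as the query string in a
# log API call like the one in the script later in this thread.
$query | Set-Content "C:\Temp\GPAttackQuery.txt" -Encoding UTF8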
05-05-2025 01:25 PM
So all of the solutions I have seen are a bit general and seem to need tuning to the specific environment in order to be effective. The attacks we have been seeing are very predictable...I suspect there is a script or 'sploit plugin being used. I wrote a PowerShell script to look for a specific error, parse through that error and harvest the origin IP, deduplicate the IP results, and write them out to a text file that can then be used as an EDL and included in a block rule. After running this script a few times a day over three days, I had a large enough IP list to stop the attacks completely for our environment. Hope this is helpful to someone else. Please pay attention to the notes in the script...you will need to change a few things to run it in your environment.
______________________
#Run these two powershell commands once on the machine you intend to run the script from in order to safely store your credential
#1. $cred = Get-Credential
#2. $cred | Export-Clixml -Path "$env:APPDATA\PaloAltoCredDR.xml"
# Configuration
$FirewallHost = "ManagementURL" # Replace with your firewall's hostname or IP
$CredFile = "$env:APPDATA\PaloAltoCredDR.xml"
$OutputFile = "C:\Temp\GPAttackIPs.txt" #Replace with the location of the file for your EDL list
# Load credentials securely
$creds = Import-Clixml -Path $CredFile
$Username = $creds.UserName
$Password = $creds.GetNetworkCredential().Password
# URL encode credentials
$EncUsername = [uri]::EscapeDataString($Username)
$EncPassword = [uri]::EscapeDataString($Password)
# Request new API key
$KeyGenUrl = "https://$FirewallHost/api/?type=keygen&user=$EncUsername&password=$EncPassword"
try {
    $keyResponse = Invoke-RestMethod -Uri $KeyGenUrl -Method Get -UseBasicParsing
    $ApiKey = $keyResponse.response.result.key
} catch {
    Write-Error "Failed to generate API key: $_"
    exit
}
# Build timestamp for 3 days ago in UTC formatted as YYYY/MM/DD HH:MM:SS
$startTime = (Get-Date).AddDays(-3).ToUniversalTime().ToString("yyyy/MM/dd HH:mm:ss")
$Query = "(receive_time geq '$startTime') and (description contains 'DNS failure or remote server down')" #This is the common "tell" in my logs for these login attacks. You can modify this search term if you need to.
$EncodedQuery = [uri]::EscapeDataString($Query)
# Submit the system log query with nlogs in the URL
$SubmitUrl = "https://$FirewallHost/api/?type=log&log-type=system&query=$EncodedQuery&nlogs=5000&key=$ApiKey"
try {
    $submitResponse = Invoke-RestMethod -Uri $SubmitUrl -Method Get -UseBasicParsing
    $jobID = $submitResponse.response.result.job
    if (-not $jobID) {
        Write-Error "No job ID returned."
        exit
    }
    Write-Host "Log query submitted as job #$jobID"
} catch {
    Write-Error "Failed to submit log query: $_"
    exit
}
# Poll job status until completion
$JobDone = $false
$StatusUrl = "https://$FirewallHost/api/?type=log&action=get&job-id=$jobID&nlogs=5000&dir=backward&key=$ApiKey"
for ($i = 0; $i -lt 20; $i++) {
    Start-Sleep -Seconds 2
    try {
        $statusResponse = Invoke-RestMethod -Uri $StatusUrl -Method Get -UseBasicParsing
        $status = $statusResponse.response.result.job.status
        if ($status -eq "FIN") {
            $JobDone = $true
            break
        }
    } catch {
        Write-Warning "Retrying status check..."
    }
}
if (-not $JobDone) {
    Write-Error "Job $jobID did not finish in time."
    exit
}
# Parse job results
$entries = $statusResponse.response.result.log.logs.entry
$entryCount = if ($entries) { $entries.Count } else { 0 }
Write-Host "Entries found: $entryCount"
if (-not $entries) {
    Write-Host "No matching logs found."
    Set-Content -Path $OutputFile -Value "" -Encoding UTF8
    exit
}
# Extract and deduplicate IPs from opaque field
$OutputIPs = @()
foreach ($entry in $entries) {
    $desc = $entry.opaque
    if ($desc -match "From:\s*(\d{1,3}(?:\.\d{1,3}){3})") {
        $ip = $matches[1]
        $OutputIPs += $ip
    }
}
Write-Host "IPs extracted: $($OutputIPs.Count)"
$UniqueIPs = $OutputIPs | Sort-Object -Unique
$UniqueIPs | Set-Content -Path $OutputFile -Encoding UTF8
Write-Host "Written $($UniqueIPs.Count) unique IPs to $OutputFile"
_______________________________________________________
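A couple of notes on using the output: the text file has to be reachable by the firewall over HTTP or HTTPS (I publish mine from a web-served directory), and it is then referenced under Objects > External Dynamic Lists and in a deny rule. To run the script on a schedule, something like the example below works. The task name, script path, and schedule are placeholders, and the task must run as the same account that exported the credential XML, since Export-Clixml protects the password with that user's DPAPI key.

# Example scheduled task: run the harvest script daily at 6 AM (add more triggers
# if you want it a few times a day). Task name and script path are placeholders.
# Run the task as the same user that created PaloAltoCredDR.xml, or the credential
# file will not decrypt.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
           -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Get-GPAttackIPs.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "GP-EDL-Refresh" -Action $action -Trigger $trigger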