alert http any any -> $HOME_NET any (msg:"OTX - FILE MD5 from pulse Inside the spyware campaign against Argentine troublemakers"; filemd5:55d79cc967db8c7bb8cb5a72.txt; reference: url, otx.alienvault.com/pulse/55d79cc967db8c7bb8cb5a72; sid:414932; rev:1;)
alert http any any -> $HOME_NET any (msg:"OTX - FILE MD5 from pulse Macro Downloaders (Aga Dell)"; filemd5:58c69a109c4484412c9d2a3b.txt; reference: url, otx.alienvault.com/pulse/58c69a109c4484412c9d2a3b; sid:414932; rev:1;)
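The collision can be confirmed by listing any sid value that appears more than once (the rules path below matches the script later in the thread and is an assumption about a typical install):

```shell
# Extract every sid from the generated rules and print values seen more
# than once; any output means the file contains duplicate rule IDs.
grep -Po '(?<=sid:)\d+' /etc/suricata/rules/otx_file_rules.rules | sort | uniq -d
```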
It doesn't seem like this tool is ever going to be updated to account for this. Here's a script I'm using to rewrite the duplicate SIDs:
# Loop over duplicate SIDs
for sid in $(grep -P -o "(?<=sid:)\d+" /etc/suricata/rules/otx_file_rules.rules | sort | uniq -d); do
    increment=0
    # Loop over lines carrying this SID, skipping the first match;
    # anchor on "sid:$sid;" so the value can't match elsewhere in the rule
    grep "sid:$sid;" /etc/suricata/rules/otx_file_rules.rules | sed 1d | while read -r line; do
        # Increment the counter
        increment=$((increment+1))
        # Extract the filemd5 value as a unique anchor for this line
        filemd5=$(echo "$line" | sed 's/.*filemd5:\([^;]*\).*/\1/')
        # Suffix the sid with the counter to hopefully generate a unique ID
        sed -i "s/\($filemd5.*sid:[^;]*\)/\1$increment/" /etc/suricata/rules/otx_file_rules.rules
    done
done
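Suffixing a digit can itself collide with an existing sid, so another option is to renumber every sid sequentially, which is collision-free by construction. A minimal sketch, assuming the rules arrive on stdin and that a base of 9000000 is unused in your deployment:

```shell
#!/bin/sh
# Read Suricata rules on stdin and rewrite every sid sequentially from a
# fixed base, so no two rules can share an ID. The base 9000000 is an
# arbitrary assumption, not a value the OTX tool reserves.
awk 'BEGIN { sid = 9000000 }
     /sid:[0-9]+;/ { sub(/sid:[0-9]+;/, "sid:" sid++ ";") }
     { print }'
```

For example: `sh renumber.sh < /etc/suricata/rules/otx_file_rules.rules > otx_file_rules.fixed`. Note this changes every rule's ID, so any external references to the old sids would need updating.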
It might be better to set a distinct gid on each duplicate to make the rules unique, rather than revising the SIDs.
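For instance, one of the colliding rules above could keep its sid but carry its own gid, since Suricata identifies a rule by the gid:sid pair (the value 1000001 here is an arbitrary illustration, not from the original report):

```
alert http any any -> $HOME_NET any (msg:"OTX - FILE MD5 from pulse Macro Downloaders (Aga Dell)"; filemd5:58c69a109c4484412c9d2a3b.txt; reference: url, otx.alienvault.com/pulse/58c69a109c4484412c9d2a3b; gid:1000001; sid:414932; rev:1;)
```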