In the case of an error, the raw message gets dumped.
It happened to me that the same message was dumped again and again (due to a bug in alien-vault-otx). When the dump exceeded 8 GB, the json module could no longer load it and bot.py got stuck at dump_data = json.load(fp). In the morning the bot was stopped, with no hint in the logs, nothing anywhere.
When dumping, shouldn't we first check whether exactly the same data is already in the file? We could add a field such as occurences_number to the dump file's JSON.
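A minimal sketch of the proposed check, assuming (hypothetically) the dump file is a JSON list of entries with a `raw` field; the function name and file layout are illustrative, not IntelMQ's actual dump format:

```python
import json
import os

def dump_message(path, message):
    """Sketch: before appending a failed raw message to the dump file,
    check whether the same data is already present and, if so, bump the
    proposed 'occurences_number' counter instead of duplicating the payload.
    (Hypothetical layout: a JSON list of {"raw": ..., "occurences_number": ...}.)
    """
    entries = []
    if os.path.exists(path):
        with open(path) as fp:
            entries = json.load(fp)
    for entry in entries:
        if entry["raw"] == message:
            # same data already dumped: count the repeat, don't re-store it
            entry["occurences_number"] = entry.get("occurences_number", 1) + 1
            break
    else:
        entries.append({"raw": message, "occurences_number": 1})
    with open(path, "w") as fp:
        json.dump(entries, fp)
```

This keeps the file size constant when the same message fails repeatedly, instead of growing without bound.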
See also #574, #869, #870
Saving one message per line would also fix the 8 GB limit error. We could then even check whether the last existing line is identical to the one we are about to dump.
If we implement this occurences_number, we also need to be able to recover the message multiple times.
But primarily we should fix the bugs in the bots themselves so this can't happen at all.
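The one-message-per-line idea could look roughly like this; a sketch only, with a hypothetical `dump_line` helper and a made-up record layout, assuming exact duplicates of the most recent dump should be skipped:

```python
import json
import os

def dump_line(path, message):
    """Sketch: store one JSON document per line (JSON Lines), so the file
    never has to be loaded as a whole. Before appending, compare against
    the last existing line and skip an exact duplicate of it.
    Returns True if the message was written, False if it was a duplicate."""
    last = None
    if os.path.exists(path):
        with open(path, "rb") as fp:
            # read only the tail of the file to find the last line cheaply
            fp.seek(0, os.SEEK_END)
            size = fp.tell()
            fp.seek(max(0, size - 65536))
            tail = fp.read().splitlines()
            if tail:
                last = json.loads(tail[-1])
    record = {"raw": message}
    if last == record:
        return False  # identical to the most recent dump: skip it
    with open(path, "a") as fp:
        fp.write(json.dumps(record) + "\n")
    return True
```

Appending a line is O(1) regardless of file size, and recovery can stream the file line by line instead of calling json.load on the whole thing.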