[plugins] speedup big journal collection
Instead of having journalctl format all the logs,
estimate the number of lines of logs needed to fill
the log_size limit.
On an EL 9.5 server with an SSD and 500MB of formatted logs,
the journal collection runtime goes from 32s to 11s.

Signed-off-by: Etienne Champetier <e.champetier@ateme.com>
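
To make the estimate concrete, here is a minimal sketch of the same
arithmetic with made-up numbers (the 250 KB sample size and 100 MB
log_size are assumptions for illustration, not values from the commit):

    # Hypothetical inputs, for illustration only.
    log_size = 100               # MB, the plugin's journal size limit
    sample_bytes = 250 * 1024    # size of the sampled last 1000 lines

    # Scale 1000 lines up to the count that would fill log_size,
    # then add a 50% margin so the capture does not come up short.
    lines = int(log_size * 1024 * 1024 / sample_bytes * 1000 * 1.5)
    print(lines)  # 614400

journalctl then only formats roughly 614k lines instead of the whole
journal, and the existing sizelimit still truncates the collected
output to log_size.
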
champtar committed Dec 11, 2024
1 parent 678ea65 commit 938d6d5
Showing 1 changed file with 11 additions and 1 deletion.
sos/report/plugins/__init__.py

@@ -3083,10 +3083,20 @@ def add_journal(self, units=None, boot=None, since=None, until=None,
         if output:
             journal_cmd += output_opt % output
 
+        fname = journal_cmd
+        if log_size > 0 and not lines:
+            # get the last 1000 lines
+            res = self.exec_cmd(f"{journal_cmd} -n1000")
+            if res['status'] == 0 and res['output'].count('\n') >= 1000:
+                # compute how many lines we need to reach log_size from
+                # the 1000 lines size, add 50% margin to avoid being short
+                lines = int(log_size*1024*1024/len(res['output'])*1000*1.5)
+                journal_cmd += lines_opt % lines
+
         self._log_debug(f"collecting journal: {journal_cmd}")
         self._add_cmd_output(cmd=journal_cmd, timeout=timeout,
                              sizelimit=log_size, pred=pred, tags=tags,
-                             priority=priority)
+                             priority=priority, suggest_filename=fname)
 
     def _expand_copy_spec(self, copyspec):
         def __expand(paths):
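
For context, plugins reach this code through Plugin.add_journal(). A
minimal hypothetical plugin (the plugin and unit names are invented for
illustration) would trigger it like this:

    from sos.report.plugins import Plugin, IndependentPlugin

    class MyService(Plugin, IndependentPlugin):
        # Hypothetical plugin: collect the journal for one unit.
        short_desc = 'my-service journal'
        plugin_name = 'myservice'

        def setup(self):
            # With this commit, journalctl is invoked with an estimated
            # line limit instead of formatting the entire journal.
            self.add_journal(units='my-service')

Note that fname is captured before the line-limit option is appended,
so the suggested output filename stays stable instead of embedding the
per-run line estimate.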
