This issue is to provide Rover with a persisted log file collecting all logs emitted by Rover. The log file should be machine-readable and, with appropriate file locks/unlocks, writable from multiple processes.
Requirements
Log files should be stored in their own directory alongside Rover's config. (Should we consider system-wide log storage?)
Logs should be able to be written from multiple processes.
We should write all levels of logs.
Logs should be machine-readable - either the structured logs that the tracing crate provides, or ndjson. This will depend on what we decide to do in logging: replace tracing crate with femme #223. Essentially, let's have logs store raw event data.
Logs should not contain sensitive information, e.g. API keys. (This point can be discussed, if anyone feels strongly that logs should also have API keys.)
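To make the machine-readable requirement concrete, here is a minimal sketch of what one ndjson record per event could look like. The field names (timestamp, level, message) are illustrative assumptions, not a decided schema, and a real implementation would use serde_json rather than hand-rolled formatting:

```rust
use std::io::Write;

// Sketch: serialize one log event as a single ndjson line.
// Field names are illustrative, not a decided schema.
fn ndjson_line(timestamp: &str, level: &str, message: &str) -> String {
    // Hand-rolled JSON for the sketch; real code would use serde_json.
    format!(
        "{{\"timestamp\":\"{}\",\"level\":\"{}\",\"message\":\"{}\"}}\n",
        timestamp, level, message
    )
}

fn main() -> std::io::Result<()> {
    let line = ndjson_line("2021-05-02T12:00:00Z", "INFO", "composition started");
    std::io::stdout().write_all(line.as_bytes())?;
    Ok(())
}
```

One record per line keeps the file appendable and lets tooling parse it line-by-line without loading the whole file.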
Decisions to make
We should decide whether local user logging is on by default.
We should decide how often to create a new file. As in, do we write to a new file every 24hrs or every week? I am keen on every 24hrs, so the files can be named by timestamp.
We should think about doing a clean up of log files. This can perhaps be done as part of another issue.
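If we go with timestamp-named daily files, cleanup becomes simple, because lexicographic filename order equals chronological order. A stdlib-only sketch of what that follow-up issue could implement (the `keep` count and `.log` suffix are assumptions):

```rust
use std::fs;
use std::io;
use std::path::Path;

// Sketch: keep only the newest `keep` log files in `dir`.
// Relies on timestamp naming (e.g. "20210502.log") so that sorting
// filenames sorts the files chronologically.
fn prune_logs(dir: &Path, keep: usize) -> io::Result<()> {
    let mut names: Vec<String> = fs::read_dir(dir)?
        .filter_map(|entry| entry.ok())
        .map(|entry| entry.file_name().to_string_lossy().into_owned())
        .filter(|name| name.ends_with(".log"))
        .collect();
    names.sort(); // oldest first, thanks to timestamp naming
    let excess = names.len().saturating_sub(keep);
    for name in &names[..excess] {
        fs::remove_file(dir.join(name))?;
    }
    Ok(())
}
```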
Writing to file
To make sure that multiple processes can write to the same file without a race condition, I propose we use advisory locks. There is a very good JavaScript library that was ported to Rust, which I think fits the bill. It essentially allows a single process to write to a file descriptor (fd) by locking out other access. If another process wants access, the file system will put its claim on hold and provide a backpressure mechanism for all other incoming data from that process until the fd is unlocked.
Essentially, once we finish #220, we will have our logging in one place. From there, we can write all information with:
let mut log_file = File::create("20210502.log")?;
let mut f = FdLock::new(log_file);
f.lock()?.write_all(b"log message")?;
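One caveat worth noting (my observation, not part of the proposal): File::create truncates an existing file, so a second invocation against the same day's file would wipe earlier entries. Opening in append mode avoids that; a stdlib-only sketch, with the lock itself elided:

```rust
use std::fs::OpenOptions;
use std::io::Write;

// Sketch (assumption, not the final design): open the daily file in
// append mode so each write adds to the end instead of truncating.
// In the real implementation, the advisory-lock guard would wrap this
// write; the lock is elided here to stay stdlib-only.
fn append_log_line(path: &str, line: &str) -> std::io::Result<()> {
    let mut file = OpenOptions::new().create(true).append(true).open(path)?;
    file.write_all(line.as_bytes())?;
    file.write_all(b"\n")?;
    Ok(())
}
```

Append mode also means each locked write lands atomically at the current end of file, which is exactly what interleaved writers need.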
Advisory locks, an illustration