121 logging improvements #122
Conversation
This sets up the basic stubs which can be used later.
So far the main CLI sub-commands report their errors appropriately, but the actual application/processor does not yet take advantage of this. Sample usage shows how I'm using custom fields to log things:
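(The original sample output isn't reproduced here, but a minimal sketch of the idea follows, assuming the standard library's log/slog package; the field names and values are illustrative, not the PR's real output.)

```go
// Minimal sketch only: assumes log/slog; the field names and values below
// are illustrative, not the PR's actual logging output.
package main

import (
	"log/slog"
	"os"
)

func main() {
	logger := slog.New(slog.NewTextHandler(os.Stderr, nil))

	// Custom fields are attached to the message as key/value attributes.
	logger.Error("failed to fetch feed",
		slog.String("feed", "https://example.com/index.rss"),
		slog.String("error", "connection refused"))
}
```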
The various sub-commands now log via the logger, and I've set things up so the processor can be given a logging-handle. The final step will be to update the processor to log appropriately, ideally with per-feed context.
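A rough sketch of that shape, again assuming log/slog; the `Processor` type and `SetLogger` method here are hypothetical stand-ins for the real code, and the point is simply deriving a per-feed child logger:

```go
// Sketch only: Processor and SetLogger are hypothetical stand-ins for the
// real types; the technique shown is deriving a per-feed logger with With().
package main

import (
	"log/slog"
	"os"
)

// Processor is a hypothetical stand-in for the real feed processor.
type Processor struct {
	logger *slog.Logger
}

// SetLogger hands the processor the shared logging-handle.
func (p *Processor) SetLogger(l *slog.Logger) {
	p.logger = l
}

// ProcessFeed derives a child logger so every message carries the feed URL.
func (p *Processor) ProcessFeed(url string) {
	feedLog := p.logger.With(slog.String("feed", url))
	feedLog.Debug("processing feed")
}

func main() {
	p := &Processor{}
	p.SetLogger(slog.New(slog.NewTextHandler(os.Stderr, nil)))
	p.ProcessFeed("https://example.com/index.rss")
}
```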
Current state:
There is missing logging from the "withstate" package, and the "email sending" stuff is still basically undocumented. The HTTP-fetching code probably needs logging too, as that's a common source of issues. (Specifically I get "processing feed xxxx", then if the remote host is down I have to wait for the fetch to time out with zero progress, plus the sleep-delays. Just adding "attempt N of Max" logs would suffice.) I also need to go through and work out what to do with the boltdb updates: are they fatal errors or warnings? I'm extremely pleased with the structured logging approach, especially now that I get source/function notes in the output too.
This is useful for showing the retries and failures
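A hedged sketch of what that "attempt N of Max" retry logging could look like; the retry count, delay, and fetch helper below are illustrative assumptions, not the project's real code:

```go
// Sketch only: the retry count, delay, and fetch helper are assumptions,
// not the project's real values or code.
package main

import (
	"fmt"
	"log/slog"
	"net/http"
	"os"
	"time"
)

func fetchWithRetries(logger *slog.Logger, url string, max int) (*http.Response, error) {
	var lastErr error
	for attempt := 1; attempt <= max; attempt++ {
		// "attempt N of Max" so an unreachable host shows visible progress.
		logger.Info("fetching feed",
			slog.String("feed", url),
			slog.Int("attempt", attempt),
			slog.Int("max", max))

		resp, err := http.Get(url)
		if err == nil {
			return resp, nil
		}
		lastErr = err

		logger.Warn("fetch failed",
			slog.String("feed", url),
			slog.Int("attempt", attempt),
			slog.String("error", err.Error()))
		time.Sleep(2 * time.Second)
	}
	return nil, fmt.Errorf("giving up after %d attempts: %w", max, lastErr)
}

func main() {
	logger := slog.New(slog.NewTextHandler(os.Stderr, nil))
	resp, err := fetchWithRetries(logger, "https://example.com/index.rss", 3)
	if err != nil {
		logger.Error("unable to fetch", slog.String("error", err.Error()))
		return
	}
	resp.Body.Close()
}
```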
Changes now:
Outstanding tasks:
However, in both cases errors are returned and are sane, so perhaps that is enough? Will ponder for the next few days.
… will bump log-levels
We're now in good shape:
Finally we've got only a few `fmt.Printf` calls left:

```
$ rgrep fmt.Printf . |grep -v .git
grep: ./rss2email: binary file matches
./daemon_cmd.go: fmt.Printf("Usage: rss2email daemon email1@example.com .. emailN@example.com\n")
./daemon_cmd.go: fmt.Printf("Usage: rss2email daemon [flags] email1 .. emailN\n")
./cron_cmd.go: fmt.Printf("Usage: rss2email cron email1@example.com .. emailN@example.com\n")
./cron_cmd.go: fmt.Printf("Usage: rss2email cron [flags] email1 .. emailN\n")
./seen_cmd.go: fmt.Printf("%s\n", buck)
./seen_cmd.go: fmt.Printf("\t%s\n", key)
./unsee_cmd.go: fmt.Printf("Please specify the URLs to unsee (i.e. remove from the state database).\n")
```

I can't think of anything to tighten here, except perhaps the handling of the errors is a bit redundant:
If there are errors now we see them twice:
Here you see a lot of noise, but summarizing:
At the end of the run we see the list of errors, which in this case just shows the same thing:
In #122 we updated our codebase to ensure that we used a common logging handle to send our output messages to the user:

* This included warnings and errors by default.
* However "developer", or other internal messages, were also available.

This pull-request builds upon that work to write the log messages to a file, as well as showing them to the user. By default a file `rss2email.log` is generated when the application starts and log entries are appended to it. However:

* LOG_FILE_PATH can be set to a different path.
* LOG_FILE_DISABLE can be used to disable this duplication.

Why do this? Partly for reference, but also partly to allow a local setup to view errors. We could now resolve #119 by adding a cronjob:

```sh
#!/bin/sh
# show logs, and delete them, run once per day
cat .../rss2email.log
rm .../rss2email.log
```

(Of course this might not work 100% as the current approach assumes the file is open forever in the case of the `daemon` sub-command.) This isn't a complete solution, but without getting into the whole template-customization and more complex considerations it's not a terrible thing to do.
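A minimal sketch of how that duplication could be wired up; only the LOG_FILE_PATH and LOG_FILE_DISABLE variable names come from the description above, and the use of log/slog with io.MultiWriter is an assumption, not necessarily the PR's implementation:

```go
// Sketch only: io.MultiWriter is an assumption; just the LOG_FILE_PATH and
// LOG_FILE_DISABLE names come from the pull-request description.
package main

import (
	"io"
	"log/slog"
	"os"
)

func newLogger() *slog.Logger {
	var out io.Writer = os.Stderr

	// Unless duplication is disabled, also append entries to a log file.
	if os.Getenv("LOG_FILE_DISABLE") == "" {
		path := os.Getenv("LOG_FILE_PATH")
		if path == "" {
			path = "rss2email.log"
		}
		if f, err := os.OpenFile(path, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0644); err == nil {
			out = io.MultiWriter(os.Stderr, f)
		}
	}

	return slog.New(slog.NewTextHandler(out, nil))
}

func main() {
	logger := newLogger()
	logger.Warn("this message goes to STDERR and, by default, to rss2email.log")
}
```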
This pull-request, once complete, will close #121 by improving our logging. Improving in this case means being very consistent in how we log messages and in what they contain.
All logging messages will be written to STDERR. By default that will be done in plain text, but it is possible to use JSON for ingestion into other systems; this will be documented in the README.md file.
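A minimal sketch of switching between plain-text and JSON output on STDERR, assuming log/slog; the LOG_FORMAT variable used here to pick the handler is purely hypothetical, since the text above says JSON is available but not how it is enabled:

```go
// Sketch only: LOG_FORMAT is a hypothetical switch; the source confirms a
// JSON option exists but not the mechanism used to select it.
package main

import (
	"log/slog"
	"os"
)

func main() {
	var handler slog.Handler

	if os.Getenv("LOG_FORMAT") == "json" {
		// JSON lines, suitable for ingestion into other systems.
		handler = slog.NewJSONHandler(os.Stderr, nil)
	} else {
		// Human-readable plain text, the default.
		handler = slog.NewTextHandler(os.Stderr, nil)
	}

	logger := slog.New(handler)
	logger.Warn("feed returned no entries",
		slog.String("feed", "https://example.com/index.rss"))
}
```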