feat: mixin / add loki compaction not successful alert #14239

Merged

Conversation

QuentinBisson (Contributor):

What this PR does / why we need it:

We recently had issues with Loki's compactor: we found out it had never once run successfully, so we added an alert internally, and I thought it would be a good idea to add it here as well.

This alert was inspired by https://github.com/grafana/mimir/blob/41a6c6d0521f7994a0c7e031c7f35bafd45aa75c/operations/mimir-mixin-compiled/alerts.yaml#L876
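
For illustration, the general shape of such a rule in a compiled mixin alerts.yaml could look like the sketch below; the alert name, "for" duration, severity label, and annotation text are assumptions for this example, not taken from this PR:

    # Hypothetical sketch of a compaction alert in a compiled mixin alerts.yaml.
    # Alert name, "for" duration, severity, and message are illustrative assumptions.
    - alert: LokiCompactionNotSuccessful
      expr: |
        # Fire when the compactor has not had a successful run in the last 24 hours.
        time() - loki_compactor_apply_retention_last_successful_run_timestamp_seconds > 60 * 60 * 24
      for: 15m
      labels:
        severity: warning
      annotations:
        message: Loki compactor has not run successfully in the last 24 hours.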

Which issue(s) this PR fixes:
Fixes #

Special notes for your reviewer:

Checklist

  • Reviewed the CONTRIBUTING.md guide (required)
  • Documentation added
  • Tests updated
  • Title matches the required conventional commits format, see here
    • Note that Promtail is considered to be feature complete, and future development for logs collection will be in Grafana Alloy. As such, feat PRs are unlikely to be accepted unless a case can be made for the feature actually being a bug fix to existing behavior.
  • Changes that require user attention or interaction to upgrade are documented in docs/sources/setup/upgrade/_index.md
  • For Helm chart changes bump the Helm chart version in production/helm/loki/Chart.yaml and update production/helm/loki/CHANGELOG.md and production/helm/loki/README.md. Example PR
  • If the change is deprecating or removing a configuration option, update the deprecated-config.yaml and deleted-config.yaml files respectively in the tools/deprecated-config-checker directory. Example PR

QuentinBisson marked this pull request as ready for review on September 24, 2024 10:05
QuentinBisson requested a review from a team as a code owner on September 24, 2024 10:05
expr: |||
# The "last successful run" metric is updated even if the compactor owns no tenants,
# so this alert correctly doesn't fire if compactor has nothing to do.
(time() - loki_compactor_apply_retention_last_successful_run_timestamp_seconds > 60 * 60 * 24)

ashwanthgoli (Contributor) commented on Sep 25, 2024:

Suggested change:
- (time() - loki_compactor_apply_retention_last_successful_run_timestamp_seconds > 60 * 60 * 24)
+ (time() - loki_boltdb_shipper_compact_tables_operation_last_successful_run_timestamp_seconds > 60 * 60 * 24)

It might be better to use the compaction metric instead of the last successful retention run.
The metric name is misleading here: it refers to boltdb, but it is updated for TSDB indexes as well.

QuentinBisson (Contributor, Author) replied:
Thanks, I might give metrics renaming a go if I have time.

The pull-request-size bot added the size/L label and removed the size/M label on Sep 25, 2024.
QuentinBisson force-pushed the feat-mixin-add-compaction-alert branch from 9b0cb43 to a13601c on September 26, 2024 14:43.
time() - (loki_boltdb_shipper_compact_tables_operation_last_successful_run_timestamp_seconds{} > 0)
)
by (%s, namespace)
> 60 * 60 * 24

ashwanthgoli (Contributor) commented on Oct 1, 2024:

nit: might be worth using a shorter interval for both alerts, like 3h? There would be query performance degradation if there are a lot of uncompacted indexes.

QuentinBisson (Contributor, Author) replied:
That makes sense yes :)
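
For context on the snippet above: the %s in the by clause is a format placeholder that the mixin presumably substitutes with its configured aggregation label (commonly cluster), and the discussion here suggests tightening the 24h threshold to 3h. A rendered expression under those assumptions might look like:

    # Hypothetical rendered expression; the max aggregation, the cluster label,
    # and the 3h threshold are assumptions based on the discussion above.
    max(
      time() - (loki_boltdb_shipper_compact_tables_operation_last_successful_run_timestamp_seconds{} > 0)
    )
    by (cluster, namespace)
    > 60 * 60 * 3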

ashwanthgoli (Contributor) left a review:
LGTM, thank you! It would be ideal if we could reduce the alert period as well :)

QuentinBisson (Contributor, Author) replied:

Done :)

QuentinBisson (Contributor, Author):

I think the failing checks are unrelated to this PR :D

ashwanthgoli merged commit da04f50 into grafana:main on Oct 2, 2024.
60 checks passed