observability for cloud_bench #1688

Open
1 of 3 tasks
Tracked by #890
stepashka opened this issue May 12, 2022 · 2 comments
Labels
a/benchmark Area: related to benchmarking
Milestone
1.0 Technical preview
Comments

@stepashka
Member

stepashka commented May 12, 2022

add metrics to track what cloud_bench does, and what the config of the run was

metrics ideas:

  • count of projects we attempted to create (tasks?)
  • error count of tasks (with commands as labels)
  • histogram of task duration for finished tasks (pgbench, with commands as labels)
  • run label: user+time?
  • profile label
  • maybe a gauge/count of tasks that still need to be scheduled?
  • anything else?
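
As a rough illustration (not code from the repo), here is a sketch of how these metrics could be defined with the Python prometheus_client library; all metric names, label names, and the run_task helper are assumptions made up for the example:

```python
# Illustrative sketch only; metric and label names are invented for this example.
from prometheus_client import Counter, Gauge, Histogram

# count of projects (tasks) we attempted to create;
# call PROJECTS_CREATED.inc() whenever a project creation is attempted
PROJECTS_CREATED = Counter(
    "cloud_bench_projects_created_total",
    "Projects cloud_bench attempted to create",
)

# error count of tasks, with the command as a label
TASK_ERRORS = Counter(
    "cloud_bench_task_errors_total",
    "Failed cloud_bench tasks",
    ["command"],
)

# histogram of task duration for finished tasks (pgbench etc.), command as a label
TASK_DURATION = Histogram(
    "cloud_bench_task_duration_seconds",
    "Duration of finished cloud_bench tasks",
    ["command"],
)

# gauge of tasks that still need to be scheduled
TASKS_PENDING = Gauge(
    "cloud_bench_tasks_pending",
    "Tasks waiting to be scheduled",
)

def run_task(command: str, fn) -> None:
    """Hypothetical wrapper showing where the metrics would be updated."""
    TASKS_PENDING.dec()
    with TASK_DURATION.labels(command=command).time():
        try:
            fn()
        except Exception:
            TASK_ERRORS.labels(command=command).inc()
            raise
```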
@stepashka stepashka added the a/benchmark Area: related to benchmarking label May 12, 2022
@neondatabase-bot neondatabase-bot bot added this to the 1.0 Technical preview milestone May 12, 2022
@stepashka stepashka changed the title from "add metrics to track what cloud_bench does, and what the config of the run was" to "observability for cloud_bench" May 12, 2022
@chaporgin
Member

chaporgin commented May 16, 2022

added metrics to the cap test dashboard

@chaporgin
Member

run label: user+time?

added user. Putting time into a label will give us high cardinality, IIUC. Should we really do that?
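
For context, a minimal sketch (again assuming the Python prometheus_client, with made-up metric and label names) of how the run start time could be exposed as the gauge's value rather than a label, so only low-cardinality labels such as user and profile create new series:

```python
from prometheus_client import Gauge

# Low-cardinality labels only; the timestamp goes into the value, not a label,
# so each new run updates an existing series instead of creating one.
RUN_START_TIME = Gauge(
    "cloud_bench_run_start_timestamp_seconds",
    "Unix timestamp of the current cloud_bench run start",
    ["user", "profile"],
)

RUN_START_TIME.labels(user="ci", profile="default").set_to_current_time()
```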
