
[Metricbeat]Create kube-state-metrics metricset which includes all info #12938

Closed
odacremolbap opened this issue Jul 17, 2019 · 8 comments
Labels
containers Related to containers use case discuss Issue needs further discussion. Metricbeat Metricbeat Team:Integrations Label for the Integrations team

Comments

@odacremolbap
Contributor

odacremolbap commented Jul 17, 2019

Describe the enhancement:

Currently, kube-state-metrics info is retrieved using a range of metricsets.
Each metricset manages a single object type, targeting:

  • container
  • deployment
  • node
  • pod
  • replicaset
  • statefulset

This is a proposal to create a single kube-state-metrics metricset that groups them all, including requested objects that are currently missing.

Benefits:

  • simplify configuration
  • resource-wise, the metricset would retrieve metrics once per cycle instead of once per cycle per object.
  • increase kubernetes coverage

Cons:

  • cannot configure different frequency of retrieval per object
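For illustration, the configuration change could look roughly like the sketch below. The per-object metricset names are the existing ones; the consolidated metricset name `state_metrics` is hypothetical:

```yaml
# Today: one metricset per kube-state-metrics object type.
- module: kubernetes
  metricsets:
    - state_container
    - state_deployment
    - state_node
    - state_pod
    - state_replicaset
    - state_statefulset
  period: 10s
  hosts: ["kube-state-metrics:8080"]

# Proposal (sketch): a single consolidated metricset.
# "state_metrics" is a hypothetical name, not an existing metricset.
- module: kubernetes
  metricsets: ["state_metrics"]
  period: 10s
  hosts: ["kube-state-metrics:8080"]
```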

related: #7058

@odacremolbap odacremolbap added Metricbeat Metricbeat containers Related to containers use case discuss Issue needs further discussion. labels Jul 17, 2019
@jsoriano jsoriano added the Team:Integrations Label for the Integrations team label Jul 18, 2019
@exekias
Contributor

exekias commented Jul 22, 2019

+1 to doing this. It would be helpful to include examples of how to drop the metrics the user may not want. That would also be a nice workaround for supporting several reporting periods.
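As a sketch of such an example, the standard Beats `drop_event` processor could filter out one object type from a consolidated metricset's output (the field prefix chosen here is just illustrative):

```yaml
# Sketch: drop all statefulset events emitted by a consolidated
# kube-state-metrics metricset, using the Beats drop_event processor
# with a has_fields condition.
processors:
  - drop_event:
      when:
        has_fields: ["kubernetes.statefulset"]
```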

Did you think about how to make this backwards compatible? It sounds like you could release the new metricset and deprecate the rest at some point (to be removed in 8.0).

@odacremolbap
Contributor Author

I thought about creating a new kubestatemetrics metricset with no compatibility with the previous metricsets, so your second option. There is no rush to deprecate the existing ones if we ever decide that keeping a single metricset is better than the current implementation.

Regarding drop events and the enricher, I think they can also be improved somehow.

  1. drop events

    If we are able to, we should drop events before processing. That's not something that can be easily standardized, but it is certainly doable.

  2. enricher

    This sounds a lot like a post-processor, and as such something optional that could potentially enrich any Kubernetes metrics, whether they are sourced from kube-state-metrics or from the kubelet.

(I wrote a multi-paragraph response to this, but the scope was growing too much; I will keep the notes and add them as issues as we evolve.)

@exekias
Contributor

exekias commented Jul 26, 2019

About enrichment, I hope we get opinionated about it. Making things optional means that users can disable them. If we use these fields in the UI or dashboards, disabling them would break the experience. We are trying to go in the other direction to avoid these issues, for instance: #13068

@exekias
Contributor

exekias commented Oct 14, 2019

About enrichment, as this moves forward it may be ok to make this module work with it instead of hardcoding things in: #14044

@ChrsMark
Member

Just a concern: will this be scalable in the long run as we keep adding more and more metrics from #7058 (comment)?

🤔

@roncohen
Contributor

Maybe we could let users specify the namespaces they are interested in, similar to the generic AWS CloudWatch metrics module?
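As a sketch of that idea, a consolidated metricset could take a list of object groups, loosely mirroring how the CloudWatch module takes a list of namespaces. Both the metricset name and the `groups` option below are hypothetical, not existing Metricbeat configuration:

```yaml
# Hypothetical sketch: limit a consolidated kube-state-metrics metricset
# to a list of object groups, analogous to the CloudWatch module's
# per-namespace metric selection. Neither "state_metrics" nor "groups"
# exists in Metricbeat today.
- module: kubernetes
  metricsets: ["state_metrics"]    # hypothetical consolidated metricset
  hosts: ["kube-state-metrics:8080"]
  period: 10s
  groups: ["pod", "deployment"]    # hypothetical filter option
```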

@exekias
Contributor

exekias commented Oct 28, 2019

Because of how kube-state-metrics works, you get all metrics in a single call, so we are retrieving them anyway. If the user doesn't want to store some of them, they can probably rely on the drop_event processor to ignore those.

That said, we should pay attention to the output we get and its size; it could be the case that we don't find some of the metrics interesting enough and can avoid reporting them.

@exekias
Contributor

exekias commented Mar 19, 2020

Closing this, as we abandoned this idea in favor of the current approach.


6 participants