Make elasticsearch/ccr metricset work for Stack Monitoring without xpack.enabled flag #21348
Conversation
Pinging @elastic/stack-monitoring (Stack monitoring)
Pinging @elastic/integrations-services (Team:Services)
I'm not seeing these stats come through. I'm running this PR and this query only returns:
Is it possible that this is because the Elasticsearch node is not set up with CCR? The response from Elasticsearch is almost empty if not:

{
  "auto_follow_stats": {
    "number_of_failed_follow_indices": 0,
    "number_of_failed_remote_cluster_state_requests": 0,
    "number_of_successful_follow_indices": 0,
    "recent_auto_follow_errors": [],
    "auto_followed_clusters": []
  },
  "follow_stats": {
    "indices": []
  }
}

And the output in the metricset will probably be empty too. Is it expected to return something else? By looking at the code, it doesn't seem so.
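For anyone reproducing this check, the quickest way to confirm whether a node has any follower indices is to hit the same endpoint the metricset reads, GET /_ccr/stats. The following is a minimal, self-contained Go sketch, not part of this PR; the localhost:9200 address and the use of the standard library instead of the Metricbeat HTTP helpers are assumptions made purely for illustration.

// ccrcheck: query /_ccr/stats and report whether any follower indices exist.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

type ccrStats struct {
	FollowStats struct {
		Indices []json.RawMessage `json:"indices"`
	} `json:"follow_stats"`
}

func main() {
	// Hypothetical local Elasticsearch node; adjust host/port and auth as needed.
	resp, err := http.Get("http://localhost:9200/_ccr/stats")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var stats ccrStats
	if err := json.NewDecoder(resp.Body).Decode(&stats); err != nil {
		panic(err)
	}

	if len(stats.FollowStats.Indices) == 0 {
		fmt.Println("no follower indices: the ccr metricset has nothing to report")
		return
	}
	fmt.Printf("found %d follower indices\n", len(stats.FollowStats.Indices))
}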
Thanks @sayden! It must have been my mistake. I'm seeing it now! Here is the query we run from Kibana:
I'm not seeing any aliases currently set up, so I'm getting this failure:
Chris, I don't know of any
@sayden Yeah, I'm not sure why it's not showing up in the mapping file, so I apologize there. Here is the full list of fields. Let's make sure all of those are in the document, but you need to map:
Force-pushed from a6315b8 to 4da617e
Okay @chrisronline, I have added all the fields from the list in your link, but I still couldn't generate a data.json. It was a bit tricky to add all those fields, so just copy-paste here if you see any issue. Thanks!
🐛 Flaky test report
❕ There are test failures but not known flaky tests.
💔 Genuine test errors: there are test failures but not known flaky tests, most likely a genuine test failure.
LGTM!
Force-pushed from 9519820 to 1710b82
@ycombinator I have added to the mapping the large number of fields that are required for CI to pass. I'm thinking of removing many fields from all metricsets once we have the feature branch ready to merge, along with the Kibana one, so that I can also do some testing locally. WDYT?
I didn't quite understand this, sorry. Why are there some extra fields that are needed now for CI to pass but can be removed later? How will we know exactly which fields these are so we can remove them all later without missing any? What would happen if we removed them now: why would CI start failing? In other words, I'm trying to understand why this PR can't contain exactly those fields that were already present before this PR (so that we don't introduce breaking changes) plus the newer fields needed by the Stack Monitoring UI. Why does CI fail with only these fields?
Filebeat and Metricbeat have an internal test for this; I think it was being launched against the non-x-pack flow only. Now that the module has all the fields from x-pack, the test is complaining, and that's why I have to add them.

The reason to add them all now and remove the unnecessary ones later is that it's easier to give everything to Chris now so that he can work on his side. Once everything is done, and before merging into master, I can experiment on my own with @chrisronline's Kibana branch and the Beats feature branch, removing fields while testing against Kibana by myself. It's double the work for me, but I think it's more reasonable than removing some fields, then bothering Chris to check, then removing more until something gets broken and then reverting. That would be slower, and I can do the removal while testing in a single PR later.

Also, most metricsets just have a schema to "apply", so it's easy to remove fields from there and from the fields.yml.
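For context, the "schema to apply" refers to the declarative field mapping each elasticsearch metricset keeps in its data.go. The sketch below is illustrative only: it assumes the libbeat schema/mapstriface helpers and shows a couple of example fields taken from the /_ccr/stats payload above, not the exact mapping added in this PR.

// Illustrative sketch of the schema pattern; field names are examples, and
// the exact Apply return values depend on the libbeat version in use.
package ccr

import (
	s "github.com/elastic/beats/v7/libbeat/common/schema"
	c "github.com/elastic/beats/v7/libbeat/common/schema/mapstriface"
)

var ccrSchema = s.Schema{
	"auto_follow": c.Dict("auto_follow_stats", s.Schema{
		"failed_follow_indices_count":     c.Int("number_of_failed_follow_indices"),
		"successful_follow_indices_count": c.Int("number_of_successful_follow_indices"),
	}),
}

// eventFields applies the schema to the decoded JSON body. Dropping a field
// from the emitted event later only means deleting its entry above, plus the
// matching entry in the metricset's _meta/fields.yml.
func eventFields(body map[string]interface{}) interface{} {
	fields, _ := ccrSchema.Apply(body)
	return fields
}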
Force-pushed from 58e2a7d to 9de5959
# Conflicts:
#	metricbeat/docs/fields.asciidoc
#	metricbeat/module/elasticsearch/fields.go
#	metricbeat/module/elasticsearch/node_stats/_meta/fields.yml
Force-pushed from a22d3a9 to 27281b5
Force-pushed from 27281b5 to bb431af
Thanks for explaining. This approach makes sense to me. My main concern is that we'll end up collecting/indexing too many fields, which brings a cost with it (e.g. we're starting to see some related support issues). So I'm good with deferring this "tuning" to the end: after all PRs are merged into the feature branch but before the feature branch is merged into master.
@@ -1,39 +0,0 @@
{
@sayden Looks like this file got deleted. Any chance you could regenerate it? The other Stack Monitoring PRs have it so it would be good to have it in this PR too.
Okay. I have added a TestData function to work with mock input data and now we have a data.json 🎉
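For reference, a TestData function in this module typically spins up a mock HTTP server with a canned response and lets the Metricbeat test helpers write the resulting event out as data.json. The sketch below is an assumption-based illustration: the fixture path, the config, and the mbtest helper names mirror what other elasticsearch metricsets use, not necessarily the exact code added here, and a complete mock would also have to serve any other endpoints the metricset queries (e.g. the root cluster-info API).

// Hypothetical sketch of a data.json generator for the ccr metricset.
package ccr

import (
	"io/ioutil"
	"net/http"
	"net/http/httptest"
	"testing"

	mbtest "github.com/elastic/beats/v7/metricbeat/mb/testing"
)

func TestData(t *testing.T) {
	// Canned /_ccr/stats response captured from a cluster with CCR enabled.
	mockBody, err := ioutil.ReadFile("./_meta/test/ccr_stats.json") // assumed fixture path
	if err != nil {
		t.Fatal(err)
	}

	// Serve the canned body for every request the metricset makes.
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		w.Write(mockBody)
	}))
	defer server.Close()

	config := map[string]interface{}{
		"module":     "elasticsearch",
		"metricsets": []string{"ccr"},
		"hosts":      []string{server.URL},
	}

	ms := mbtest.NewReportingMetricSetV2Error(t, config)
	if err := mbtest.WriteEventsReporterV2Error(ms, t, ""); err != nil {
		t.Fatal("write", err)
	}
}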
Just left one comment requesting to generate and add data.json to this PR.
jenkins test this
/test metricbeat
Okay! CI is green, so I guess it's ready, or at least close. @ycombinator can you take a look when you have some time, please? 🙂
LGTM.
…csearch/ccr_xpack_flag
# Conflicts:
#	metricbeat/module/elasticsearch/fields.go
/test metricbeat
Green CI again, finally! Merging!
Ready to test in Kibana. data.json is included to help with troubleshooting, and it has been generated by testing the Metricbeat binary directly against Elasticsearch.