[Filebeat] kafka v2 using parsers #27335
Conversation
💚 Build succeeded; tests succeeded.
This looks really good! The only obvious thing I see that needs work (other than updating generated files to make the linter happy) is the handling of expandEventListFromField. Otherwise everything looks like a clear improvement over the previous version :-)
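To illustrate what expandEventListFromField has to handle, here is a minimal sketch in Go. This is an illustration of the behavior, not Filebeat's actual implementation: given a raw JSON payload and a field name, the array under that field is split into individual event payloads (the function name and signature here are hypothetical).

```go
package main

import (
	"encoding/json"
	"fmt"
)

// expandEventList splits the JSON array found under `field` in `raw`
// into one raw payload per array element. Hypothetical sketch of the
// expand_event_list_from_field behavior, not Filebeat's real code.
func expandEventList(raw []byte, field string) ([][]byte, error) {
	var doc map[string]json.RawMessage
	if err := json.Unmarshal(raw, &doc); err != nil {
		return nil, err
	}
	var items []json.RawMessage
	if err := json.Unmarshal(doc[field], &items); err != nil {
		return nil, err
	}
	out := make([][]byte, 0, len(items))
	for _, it := range items {
		out = append(out, []byte(it))
	}
	return out, nil
}

func main() {
	events, err := expandEventList([]byte(`{"Records":[{"a":1},{"a":2}]}`), "Records")
	if err != nil {
		panic(err)
	}
	// Each element of the "Records" array becomes its own event payload.
	fmt.Println(len(events))
	fmt.Println(string(events[0]))
}
```

In the PR this behavior lives in its own reader (listFromFieldReader) wrapped around the parsers, so acknowledgment of the original Kafka message can be tied back to all events expanded from it.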
Fixing lint issues
I think it should be pretty good now. I still need to add documentation and update the PR description. I adapted the Ack. I picked a reader over a parser because, if we introduce generic behavior, we might want a different name / location.
Couldn't reuse it.
Pinging @elastic/agent (Team:Agent)
Minor comments, we are going in the right direction. Thanks!
This looks great, thank you!
* using input v2, adding config for parsers
* implemented reader and parsers
* implemented expandEventListFromField in separate reader
* fixing lint issues
* adding documentation
* adding plugin test method
* making parseMultipleMessages a function of listFromFieldReader
* changing order of return values
* adding replacement for input v1
* lint :(

(cherry picked from commit 20d6038)
* master: (39 commits)
  * [Heartbeat] Move JSON tests from python->go (elastic#27816)
  * docs: simplify permissions for Dockerfile COPY (elastic#27754)
  * Osquerybeat: Fix osquery logger plugin severy levels mapping (elastic#27789)
  * [Filebeat] Update compatibility function to remove processor description on ES < 7.9.0 (elastic#27774)
  * warn log entry and no validation failure when both queue_url and buck… (elastic#27612)
  * libbeat/cmd/instance: ensure test config file has appropriate permissions (elastic#27178)
  * [Heartbeat] Add httpcommon options to ZipURL (elastic#27699)
  * Add a header round tripper option to httpcommon (elastic#27509)
  * [Elastic Agent] Add validation to ensure certificate paths are absolute. (elastic#27779)
  * Rename dashboards according to module.yml files for master (elastic#27749)
  * Refactor vagrantfile, add scripts for provisioning with docker/kind (elastic#27726)
  * Accept syslog dates with leading 0 (elastic#27775)
  * [Filebeat] Add timezone config option to decode_cef and syslog input (elastic#27727)
  * [Filebeat] Threatintel compatibility updates (elastic#27323)
  * Add support for ephemeral containers in elastic agent dynamic provider (elastic#27707)
  * [Filebeat] Integration tests in CI for AWS-S3 input (elastic#27491)
  * Fix flakyness of TestFilestreamEmptyLine (elastic#27705)
  * [Filebeat] kafka v2 using parsers (elastic#27335)
  * Update Kafka version parsing / supported range (elastic#27720)
  * Update Sarama to 1.29.1 (elastic#27717)
  * ...
What does this PR do?
Moving the kafka input forward to version 2 of the input API removes tech debt. Adding parsers allows the payload to be interpreted in a flexible way, which for example fixes the issue outlined in #26833. See also #26130 & #15324.
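As a sketch of what this enables, a kafka input could carry a parsers section much like filestream does. The broker address, topic name, and parser options below are placeholders for illustration, not values taken from this PR:

```yaml
filebeat.inputs:
  - type: kafka
    hosts: ["localhost:9092"]   # placeholder broker address
    topics: ["example-topic"]   # placeholder topic name
    group_id: "filebeat"
    # The parsers section this PR adds to the kafka input; ndjson and
    # multiline are the parsers shared with other v2 inputs.
    parsers:
      - ndjson:
          target: ""
          add_error_key: true
```

With a configuration along these lines, each Kafka message value is run through the configured parser chain before being published as an event.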
Why is it important?
Addresses tech debt, and allows the kafka input to be used in more setups.
Checklist
- I have added an entry in CHANGELOG.next.asciidoc or CHANGELOG-developer.next.asciidoc.

How to test this PR locally
See the integration tests for the kafka input
Related issues
- parsers in all Filebeat inputs #26130
- Filebeat input v2 API #15324

Use cases
Having parsers allows the payload to be interpreted in a more flexible way; for example, JSON can be picked up as structured data. This means that if the data is in the correct structure it can be picked up by a module. #27154 and #26862 would also allow us to further correct things on the Filebeat side, but this PR already allows preprocessing to happen in an external Logstash or Filebeat placed in front of the Filebeat running the module.