
Logstash Pipeline failed -- DNS filter could not perform reverse lookup on missing field #13975

aattelli91 opened this issue Apr 6, 2022 · 1 comment

aattelli91 commented Apr 6, 2022

Logstash information:

  1. Logstash version (bin/logstash --version): 8.0.0, 8.1.0, 8.1.1, 8.1.2
  2. Logstash installation source (e.g. built from source, with a package manager: DEB/RPM, expanded from tar or zip archive, docker): RPM
  3. How is Logstash being run (e.g. as a service/service manager: systemd, upstart, etc. Via command line, docker/kubernetes): systemd

Plugins installed (bin/logstash-plugin list --verbose):

JVM (e.g. java -version):
11.0.13+8


OS version (uname -a if on a Unix-like system):
el7.x86_64

Description of the problem including expected versus actual behavior:
We recently upgraded Logstash from 7.17.0 to 8.0.0. Since then, the pipeline has stopped parsing logs, and we noticed the error below for UDP traffic:

"[2022-04-06T11:06:24,385][ERROR][logstash.javapipeline ][various_to_splunk_and_s3] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"various_to_splunk_and_s3", :error=>"(TypeError) no implicit conversion of Hash into String", :exception=>Java::OrgJrubyExceptions::TypeError, :backtrace=>["org.jruby.RubyRegexp.match(org/jruby/RubyRegexp.java:1162)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_filter_minus_dns_minus_3_dot_1_dot_4.lib.logstash.filters.dns.reverse(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-dns-3.1.4/lib/logstash/filters/dns.rb:273)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1821)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_filter_minus_dns_minus_3_dot_1_dot_4.lib.logstash.filters.dns.reverse(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-dns-3.1.4/lib/logstash/filters/dns.rb:255)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_filter_minus_dns_minus_3_dot_1_dot_4.lib.logstash.filters.dns.filter(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-dns-3.1.4/lib/logstash/filters/dns.rb:118)", "usr.share.logstash.logstash_minus_core.lib.logstash.filters.base.do_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159)", "usr.share.logstash.logstash_minus_core.lib.logstash.filters.base.multi_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:178)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1821)", "usr.share.logstash.logstash_minus_core.lib.logstash.filters.base.multi_filter(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:175)", "org.logstash.config.ir.compiler.AbstractFilterDelegatorExt.multi_filter(org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:299)"], :thread=>"#<Thread:0x6cb96915 run>"}
[2022-04-06T11:06:45,901][INFO ][logstash.javapipeline ][various_to_splunk_and_s3] Pipeline terminated {"pipeline.id"=>"various_to_splunk_and_s3"}"

Configuration:

filter {
  dns {
    reverse => [ "host" ]
    action => "replace"
    failed_cache_size => 1024
    failed_cache_ttl => 3600
    hit_cache_size => 1024
    hit_cache_ttl => 3600
  }
}

Please check and assist in resolving this issue.

Steps to reproduce:


  1. Upgrade Logstash from 7.17.0 to 8.0.0.
  2. Send UDP traffic that triggers the dns filter's reverse lookup (a minimal pipeline sketch follows).
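
A minimal pipeline sketch that should reproduce the crash on 8.x; the port and the stdout output are assumptions added for illustration, and the dns options mirror the configuration above:

input {
  udp {
    port => 5514    # arbitrary port chosen for this sketch
  }
}

filter {
  dns {
    reverse => [ "host" ]
    action  => "replace"
  }
}

output {
  stdout { codec => rubydebug }    # just to observe the resulting events
}

With ECS compatibility on by default in 8.x, the udp input records the sender address under the [host] object (e.g. [host][ip]) rather than as a top-level host string, so the dns filter's reverse lookup on host receives a hash instead of a string and raises the TypeError shown in the log above.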


yaauie (Member) commented Apr 6, 2022

At first glance, it appears that you are running an ECS pipeline, where host is expected to be an object containing individual properties like ip, name, geo, etc. It looks like you have upgraded across the 8.0 boundary, which made ECS on-by-default, but your filter definition still expects the top-level host to be a string value. You likely need to disable ECS compatibility for the pipeline by adding pipeline.ecs_compatibility: disabled to its definition in pipelines.yml, which will cause all plugins in the pipeline to behave as they did in 7.x.
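
For example, a pipelines.yml entry along these lines would pin this pipeline to the 7.x field behaviour (the pipeline id is taken from the log above; the config path is an assumption for this sketch):

- pipeline.id: various_to_splunk_and_s3
  path.config: "/etc/logstash/conf.d/various_to_splunk_and_s3.conf"   # path is an assumption
  pipeline.ecs_compatibility: disabled

Logstash needs to be restarted (or the pipeline reloaded) for the setting to take effect.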

ECS compatibility is now on by default

Many plugins can now be run in a mode that avoids implicit conflict with the Elastic Common Schema. This mode is controlled individually with each plugin’s ecs_compatibility option, which defaults to the value of the Logstash pipeline.ecs_compatibility setting. In Logstash 8, this compatibility mode will be on-by-default for all pipelines. #11623

If you wish to lock in a pipeline’s behaviour from Logstash 7.x before upgrading to Logstash 8, you can add pipeline.ecs_compatibility: disabled to its definition in pipelines.yml (or globally in logstash.yml).
-- Logstash Reference: Breaking changes in 8.0
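
As the quoted note mentions, ECS compatibility can also be controlled per plugin. A narrower alternative sketch (an illustration, not a recommendation from this thread; the port is an assumption) would disable ECS mode on just the udp input so it keeps emitting a top-level host string:

input {
  udp {
    port => 5514                      # port is an assumption
    ecs_compatibility => disabled     # keep the 7.x behaviour: top-level "host" as a string
  }
}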

I have separately opened a PR on the DNS filter to prevent this kind of crash (logstash-plugins/logstash-filter-dns#65).
