
Stacktrace when monitoring Elasticsearch with MetricBeat #6192

Closed
beiske opened this issue Jan 26, 2018 · 4 comments

Comments

@beiske
Member

beiske commented Jan 26, 2018

Beats version: 6.1.2
Elasticsearch output cluster version: 6.0.1
Monitored Elasticsearch cluster version: 2.4.6
Operating system: Ubuntu, 64-bit

Config

cat /mnt/data/local-1/allocator/10.0.2.15/state/elasticsearch/admin-console/instance-0000000000/beat-modules.yaml
---
- module: "elasticsearch"
  metricsets:
  - "node"
  - "node_stats"
  period: "10s"
  user: "ec-local-beats-monitor"
  password: "**REDACTED**"
  hosts:
  - "10.0.2.15:18461"
  ssl: false

cat metricbeat.yml
###################### Metricbeat Configuration  #######################

#==========================  Modules configuration ============================

metricbeat.config.modules:
  path: /mnt/data/local-1/allocator/10.0.2.15/state/elasticsearch/*/*/beat-modules.yaml
  reload.enabled: true
  reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 1
  index.codec: best_compression
  #_source.enabled: false

#================================ General =====================================


cloud.id: "temperature:**REDACTED**"
cloud.auth: "elastic:**REDACTED**"

This results in:

./metricbeat -e
metricbeat2018/01/26 09:29:59.231294 cloudid.go:42: INFO Setting Elasticsearch and Kibana URLs based on the cloud id: output.elasticsearch.hosts=https://****.eu-west-1.aws.found.io:443 and setup.kibana.host=https://****.eu-west-1.aws.found.io:443
2018/01/26 09:29:59.231982 beat.go:436: INFO Home path: [/home/found/metricbeat-6.1.2-linux-x86_64] Config path: [/home/found/metricbeat-6.1.2-linux-x86_64] Data path: [/home/found/metricbeat-6.1.2-linux-x86_64/data] Logs path: [/home/found/metricbeat-6.1.2-linux-x86_64/logs]
2018/01/26 09:29:59.232129 beat.go:443: INFO Beat UUID: 67de923d-31e7-4cf3-bd17-ec6be5f761d5
2018/01/26 09:29:59.232252 beat.go:203: INFO Setup Beat: metricbeat; Version: 6.1.2
2018/01/26 09:29:59.232727 client.go:123: INFO Elasticsearch url: https://87e07353125f11103aa94f02c757e5b0.eu-west-1.aws.found.io:443
2018/01/26 09:29:59.232385 metrics.go:23: INFO Metrics logging every 30s
2018/01/26 09:29:59.233714 module.go:76: INFO Beat name: found
2018/01/26 09:29:59.234393 beat.go:276: INFO metricbeat start running.
2018/01/26 09:29:59.234932 cfgwarn.go:11: WARN BETA: Dynamic config reload is enabled.
2018/01/26 09:29:59.238281 reload.go:127: INFO Config reloader started
2018/01/26 09:30:09.240298 cfgwarn.go:11: WARN BETA: The elasticsearch node metricset is beta
2018/01/26 09:30:09.240606 cfgwarn.go:11: WARN BETA: The elasticsearch node_stats metricset is beta
2018/01/26 09:30:09.240779 reload.go:258: INFO Starting 1 runners ...
2018/01/26 09:30:09.835804 log.go:175: ERR recovered from panic while fetching 'elasticsearch/node' for host '10.0.2.15:18461'. Recovering, but please report this: runtime error: invalid memory address or nil pointer dereference.
2018/01/26 09:30:09.835869 log.go:176: ERR Stacktrace: goroutine 70 [running]:
runtime/debug.Stack(0x2c53620, 0x2b, 0xc4203046e0)
	/usr/local/go/src/runtime/debug/stack.go:24 +0xa7
github.com/elastic/beats/libbeat/logp.Recover(0xc42001c720, 0x53)
	/go/src/github.com/elastic/beats/libbeat/logp/log.go:176 +0x12f
panic(0x28d8cc0, 0x3be95c0)
	/usr/local/go/src/runtime/panic.go:491 +0x283
github.com/elastic/beats/metricbeat/helper.(*HTTP).FetchResponse(0x0, 0x0, 0x0, 0xf2)
	/go/src/github.com/elastic/beats/metricbeat/helper/http.go:77 +0x3a
github.com/elastic/beats/metricbeat/helper.(*HTTP).FetchContent(0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/go/src/github.com/elastic/beats/metricbeat/helper/http.go:116 +0x7c
github.com/elastic/beats/metricbeat/module/elasticsearch/node.(*MetricSet).Fetch(0xc42019e2d0, 0x2, 0x2, 0x2, 0x0, 0xc420304bc8)
	/go/src/github.com/elastic/beats/metricbeat/module/elasticsearch/node/node.go:48 +0x36
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).multiEventFetch(0xc420118f80, 0x7f58dcba2950, 0xc42019e2d0, 0x3c18a00, 0xc4201600c0)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:215 +0x4f
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).fetch(0xc420118f80, 0x3c18a00, 0xc4201600c0)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:199 +0x112
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).startPeriodicFetching(0xc420118f80, 0x3c18a00, 0xc4201600c0)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:176 +0x5d
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).run(0xc420118f80, 0xc42001c6c0, 0xc420529620)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:163 +0x59a
github.com/elastic/beats/metricbeat/mb/module.(*Wrapper).Start.func1(0xc420200830, 0xc42001c6c0, 0xc420529620, 0xc420118f80)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:107 +0xbd
created by github.com/elastic/beats/metricbeat/mb/module.(*Wrapper).Start
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:103 +0x146
2018/01/26 09:30:12.495242 log.go:175: ERR recovered from panic while fetching 'elasticsearch/node_stats' for host '10.0.2.15:18461'. Recovering, but please report this: runtime error: invalid memory address or nil pointer dereference.
2018/01/26 09:30:12.495325 log.go:176: ERR Stacktrace: goroutine 71 [running]:
runtime/debug.Stack(0x2c53620, 0x2b, 0xc4203056e0)
	/usr/local/go/src/runtime/debug/stack.go:24 +0xa7
github.com/elastic/beats/libbeat/logp.Recover(0xc42001c780, 0x59)
	/go/src/github.com/elastic/beats/libbeat/logp/log.go:176 +0x12f
panic(0x28d8cc0, 0x3be95c0)
	/usr/local/go/src/runtime/panic.go:491 +0x283
github.com/elastic/beats/metricbeat/helper.(*HTTP).FetchResponse(0x0, 0x0, 0x0, 0xf2)
	/go/src/github.com/elastic/beats/metricbeat/helper/http.go:77 +0x3a
github.com/elastic/beats/metricbeat/helper.(*HTTP).FetchContent(0x0, 0x0, 0x0, 0x0, 0x0, 0x0)
	/go/src/github.com/elastic/beats/metricbeat/helper/http.go:116 +0x7c
github.com/elastic/beats/metricbeat/module/elasticsearch/node_stats.(*MetricSet).Fetch(0xc42019e360, 0x2, 0x2, 0x2, 0x0, 0xc420305bc8)
	/go/src/github.com/elastic/beats/metricbeat/module/elasticsearch/node_stats/node_stats.go:46 +0x36
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).multiEventFetch(0xc420118fa0, 0x7f58dcba2a70, 0xc42019e360, 0x3c18a00, 0xc420278180)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:215 +0x4f
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).fetch(0xc420118fa0, 0x3c18a00, 0xc420278180)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:199 +0x112
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).startPeriodicFetching(0xc420118fa0, 0x3c18a00, 0xc420278180)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:176 +0x5d
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).run(0xc420118fa0, 0xc42001c6c0, 0xc420529620)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:163 +0x59a
github.com/elastic/beats/metricbeat/mb/module.(*Wrapper).Start.func1(0xc420200830, 0xc42001c6c0, 0xc420529620, 0xc420118fa0)
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:107 +0xbd
created by github.com/elastic/beats/metricbeat/mb/module.(*Wrapper).Start
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:103 +0x146
@beiske beiske added the bug label Jan 26, 2018
@ruflin
Member

ruflin commented Jan 28, 2018

@beiske I managed to reproduce this, and it's related to ssl: false. If you remove that from the config, it should work as expected. I'll investigate further what is causing this.

@ruflin
Member

ruflin commented Jan 28, 2018

@beiske The issue is that ssl is not a valid config option; ssl.enabled: false is what you are looking for. Still, a wrong config should produce an error rather than make Metricbeat panic. I will create a follow-up PR to fix that, but renaming the option on your end should solve the issue for you.
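
For reference, the corrected module config would look roughly like this (a sketch based on the beat-modules.yaml shown above; credentials redacted as in the original):

---
- module: "elasticsearch"
  metricsets:
  - "node"
  - "node_stats"
  period: "10s"
  user: "ec-local-beats-monitor"
  password: "**REDACTED**"
  hosts:
  - "10.0.2.15:18461"
  ssl.enabled: false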

@ruflin
Member

ruflin commented Jan 28, 2018

PR with a fix is here: #6205

@beiske
Member Author

beiske commented Jan 30, 2018

@ruflin Thanks!

ruflin added a commit to ruflin/beats that referenced this issue Jan 31, 2018
When setting an invalid http config in a Metricbeat module / metricset, it could happen that the metricset panicked. The reason is that the error from unpacking the config was not properly returned and handled, which led to an empty http instance.

This PR changes the behaviour to return the errors, which can then be handled by the metricsets. The issue also affects all metricsets that were based on the prometheus helper.

Closes elastic#6192
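
To illustrate the failure mode the commit describes, here is a minimal, self-contained Go sketch (not the actual Beats code; all names are hypothetical): if the error from unpacking the config is dropped, the metricset is left with a nil HTTP helper, and the first fetch dereferences it, which is what the stacktrace above shows. Returning the error instead surfaces a clear config error at setup time.

package main

import (
	"errors"
	"fmt"
)

// httpHelper stands in for Metricbeat's HTTP helper in this illustration.
type httpHelper struct{ uri string }

// FetchContent dereferences the receiver, so calling it on a nil helper
// panics with "invalid memory address or nil pointer dereference".
func (h *httpHelper) FetchContent() ([]byte, error) {
	return []byte(h.uri), nil
}

// newHelper fails when the module config contains an unknown option,
// e.g. "ssl" instead of "ssl.enabled".
func newHelper(unpackErr error) (*httpHelper, error) {
	if unpackErr != nil {
		return nil, unpackErr
	}
	return &httpHelper{uri: "http://10.0.2.15:18461/_nodes"}, nil
}

// newMetricSetOld mirrors the old behaviour: the unpack error is ignored,
// so the stored helper is nil and a later fetch panics.
func newMetricSetOld(unpackErr error) *httpHelper {
	h, _ := newHelper(unpackErr)
	return h
}

// newMetricSetFixed mirrors the fix: the error is returned, so the metricset
// fails at setup time with a readable message instead of panicking mid-fetch.
func newMetricSetFixed(unpackErr error) (*httpHelper, error) {
	h, err := newHelper(unpackErr)
	if err != nil {
		return nil, fmt.Errorf("error creating HTTP helper: %v", err)
	}
	return h, nil
}

func main() {
	badConfig := errors.New("invalid config option 'ssl'")

	if _, err := newMetricSetFixed(badConfig); err != nil {
		fmt.Println("fixed behaviour, setup error:", err)
	}

	defer func() {
		if r := recover(); r != nil {
			fmt.Println("old behaviour, recovered from panic:", r)
		}
	}()
	h := newMetricSetOld(badConfig) // h is nil here
	h.FetchContent()                // nil pointer dereference, recovered above
}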
exekias pushed a commit that referenced this issue Jan 31, 2018