
Metricbeat process module crash #6692

Closed · arkadicolson opened this issue Mar 29, 2018 · 3 comments

Comments
@arkadicolson

arkadicolson commented Mar 29, 2018

As requested in ticket #5337 (comment), I have opened a new one.

After some time Metricbeat stops shipping process-related data. A restart fixes the issue temporarily. We are seeing this on Debian 7 and 8 with Metricbeat 6.2.2.
Please find the stack trace below:

2018-03-26T01:54:52.640+0200 ERROR ioutil/ioutil.go:30 recovered from panic while fetching 'system/process' for host ''. Recovering, but please report this: runtime error: slice bounds out of range.

{"stack": "github.com/elastic/beats/libbeat/logp.Recover
	/go/src/github.com/elastic/beats/libbeat/logp/global.go:88
runtime.call32
	/usr/local/go/src/runtime/asm_amd64.s:509
runtime.gopanic
	/usr/local/go/src/runtime/panic.go:491
io/ioutil.readAll.func1
	/usr/local/go/src/io/ioutil/ioutil.go:30
runtime.call32
	/usr/local/go/src/runtime/asm_amd64.s:509
runtime.gopanic
	/usr/local/go/src/runtime/panic.go:491
runtime.panicslice
	/usr/local/go/src/runtime/panic.go:35
bytes.(*Buffer).ReadFrom
	/usr/local/go/src/bytes/buffer.go:210
io/ioutil.readAll
	/usr/local/go/src/io/ioutil/ioutil.go:33
io/ioutil.ReadFile
	/usr/local/go/src/io/ioutil/ioutil.go:70
github.com/elastic/beats/vendor/github.com/elastic/gosigar.readProcFile
	/go/src/github.com/elastic/beats/vendor/github.com/elastic/gosigar/sigar_linux_common.go:425
github.com/elastic/beats/vendor/github.com/elastic/gosigar.(*ProcEnv).Get
	/go/src/github.com/elastic/beats/vendor/github.com/elastic/gosigar/sigar_linux_common.go:325
github.com/elastic/beats/libbeat/metric/system/process.getProcEnv
	/go/src/github.com/elastic/beats/libbeat/metric/system/process/process.go:177
github.com/elastic/beats/libbeat/metric/system/process.(*Process).getDetails
	/go/src/github.com/elastic/beats/libbeat/metric/system/process/process.go:132
github.com/elastic/beats/libbeat/metric/system/process.(*Stats).getSingleProcess
	/go/src/github.com/elastic/beats/libbeat/metric/system/process/process.go:452
github.com/elastic/beats/libbeat/metric/system/process.(*Stats).Get
	/go/src/github.com/elastic/beats/libbeat/metric/system/process/process.go:393
github.com/elastic/beats/metricbeat/module/system/process.(*MetricSet).Fetch
	/go/src/github.com/elastic/beats/metricbeat/module/system/process/process.go:83
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).multiEventFetch
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:227
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).fetch
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:207
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).startPeriodicFetching
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:194
github.com/elastic/beats/metricbeat/mb/module.(*metricSetWrapper).run
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:171
github.com/elastic/beats/metricbeat/mb/module.(*Wrapper).Start.func1
	/go/src/github.com/elastic/beats/metricbeat/mb/module/wrapper.go:112"}
@jsoriano
Member

jsoriano commented Mar 29, 2018

This is weird, because the panic seems to happen in ioutil.ReadFile, which should be safe.

@arkadicolson can you think of anything that might make your scenario special? Are the machines VMs? Do they have very high memory or I/O pressure? Do you run Metricbeat with memory limits?
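
For context, here is a minimal sketch of the kind of read the stack trace points at: gosigar's ProcEnv.Get goes through readProcFile and ioutil.ReadFile, which in turn uses bytes.Buffer.ReadFrom, the frame where the slice-bounds panic is raised. The exact file path below is an assumption for illustration only; this is not the gosigar code itself.

```go
// Illustrative only: performs the same style of proc read that the stack
// trace shows (ProcEnv.Get -> readProcFile -> ioutil.ReadFile). The target
// file here is an assumption, not necessarily what gosigar reads.
package main

import (
	"fmt"
	"io/ioutil"
	"os"
)

func main() {
	path := fmt.Sprintf("/proc/%d/environ", os.Getpid())

	// ioutil.ReadFile has no slicing logic of its own; in the Go version
	// from the trace it reads through bytes.Buffer.ReadFrom, which is where
	// the reported "slice bounds out of range" panic surfaces.
	data, err := ioutil.ReadFile(path)
	if err != nil {
		fmt.Fprintln(os.Stderr, "read failed:", err)
		os.Exit(1)
	}
	fmt.Printf("read %d bytes from %s\n", len(data), path)
}
```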

@jsoriano
Member

This issue seems related: golang/go#22097

jsoriano added a commit to jsoriano/gosigar that referenced this issue Mar 29, 2018

Panics have been reported when reading from proc files. This shouldn't happen, but until the root cause is found we can work around the issue by recovering from the panic and reporting the specific procfile whose read provoked it.

See elastic/beats#6692
ruflin pushed a commit to elastic/gosigar that referenced this issue Mar 30, 2018

Panics have been reported when reading from proc files. This shouldn't happen, but until the root cause is found we can work around the issue by recovering from the panic and reporting the specific procfile whose read provoked it.

See elastic/beats#6692
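
A rough sketch of the workaround the commit message describes, not the actual gosigar patch: wrap the proc-file read in a deferred recover and turn any panic into an error that names the offending file. The function name and placement here are assumptions.

```go
// Hypothetical illustration of the described workaround; the name
// readProcFileSafe is an assumption, not the real gosigar change.
package procread

import (
	"fmt"
	"io/ioutil"
)

// readProcFileSafe reads a proc file and converts a panic raised during the
// read into an error that reports which procfile provoked it.
func readProcFileSafe(path string) (contents []byte, err error) {
	defer func() {
		if r := recover(); r != nil {
			contents = nil
			err = fmt.Errorf("recovered panic while reading %s: %v", path, r)
		}
	}()
	return ioutil.ReadFile(path)
}
```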
@arkadicolson
Author

Hi
We installed the new 6.2.2 recently, but the package installation did not trigger a restart. After a manual restart the problem seems to be resolved.
Br,
Arkadi
