
'invalid memory address or nil pointer dereference' error in Datadog Logs exporter #16077

Closed
julealgon opened this issue Nov 4, 2022 · 2 comments · Fixed by #16390
Labels
bug (Something isn't working), data:logs (Logs related issues), exporter/datadog (Datadog components), priority:p1 (High)

Comments

@julealgon

Component(s)

exporter/datadog

What happened?

Description

A `runtime error: invalid memory address or nil pointer dereference` panic occurred in the Datadog Logs exporter for no apparent reason, as shown in the stack trace below.

Steps to Reproduce

Unknown. The collector runs on a VM; when I logged into the machine today to check something else, it had stopped with this error in the console.

Expected Result

The collector should run continuously without ever producing errors like this.

Actual Result

The collector halted execution completely.

Collector version

0.62.1

Environment information

Environment

Azure Windows VM

OpenTelemetry Collector configuration

receivers:
  otlp:
    protocols:
      grpc:

  hostmetrics:
    collection_interval: 10s
    scrapers:
      paging:
        metrics:
          system.paging.utilization:
            enabled: true
      cpu:
        metrics:
          system.cpu.utilization:
            enabled: true
      disk:
      filesystem:
        metrics:
          system.filesystem.utilization:
            enabled: true
      load:
      memory:
      network:
      processes:

processors:
  batch:
    timeout: 10s

  resource:
    attributes:
    - key: deployment.environment
      value: "beta"
      action: upsert

exporters:
  datadog:
    api:
      key: ${DATADOG_APIKEY}

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [resource, batch]
      exporters: [datadog]
    metrics:
      receivers: [hostmetrics, otlp]
      processors: [resource, batch]
      exporters: [datadog]
    logs:
      receivers: [otlp]
      processors: [resource, batch]
      exporters: [datadog]

Log output

2022-10-27T07:53:56.703Z        info    metadata/metadata.go:216        Sent host metadata      {"kind": "exporter", "data_type": "metrics", "name": "datadog"}
panic: runtime error: invalid memory address or nil pointer dereference
[signal 0xc0000005 code=0x0 addr=0x40 pc=0x29487f6]

goroutine 165 [running]:
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter/internal/logs.(*Sender).SubmitLogs(0xc0008278e0, {0x78ea758, 0xc00115bd00}, {0xc001f93b00, 0x2, 0x2})
        github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.62.0/internal/logs/sender.go:67 +0x236
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.(*logsExporter).consumeLogs(0xc000bf1b80, {0x6cbc76e?, 0x10?}, {0xc001f93a80?})
        github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.62.0/logs_exporter.go:101 +0x16c
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsRequest).Export(0x2ed8f1e?, {0x78ea800?, 0xc002023b00?})
        go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/logs.go:65 +0x34
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send(0xc000962490, {0x7907d80, 0xc000b90c30})
        go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/common.go:203 +0x96
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send(0xc00069d440, {0x7907d80, 0xc000b90c30})
        go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:388 +0x58d
go.opentelemetry.io/collector/exporter/exporterhelper.(*logsExporterWithObservability).send(0xc000c62198, {0x7907d80, 0xc000b90c30})
        go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/logs.go:132 +0x88
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1({0x7907d80, 0xc000b90c30})
        go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/queued_retry.go:206 +0x39
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers.func1()
        go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/internal/bounded_memory_queue.go:61 +0xb6
created by go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers
        go.opentelemetry.io/collector@v0.62.1/exporter/exporterhelper/internal/bounded_memory_queue.go:56 +0x45
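The stack trace points at `(*Sender).SubmitLogs` in `internal/logs/sender.go`. A common cause of this class of panic in Go is reading fields of an `*http.Response` before checking the error: when the request fails at the transport level, the response is `nil`. The sketch below reproduces the failure pattern with hypothetical names (`buggySubmit` is not the actual exporter code):

```go
package main

import (
	"fmt"
	"net/http"
)

// buggySubmit mimics the failing pattern: the response is consulted before
// the error is checked, so a transport failure (resp == nil) panics with
// "invalid memory address or nil pointer dereference". The deferred recover
// converts the panic into an error so this demo can print it.
func buggySubmit(url string) (status int, err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("recovered: %v", r)
		}
	}()
	resp, err := http.Get(url)
	status = resp.StatusCode // BUG: resp is nil whenever err != nil
	if err != nil {
		return 0, err
	}
	defer resp.Body.Close()
	return status, nil
}

func main() {
	// "://invalid" fails URL parsing, so http.Get returns (nil, err) without
	// any network traffic; the unchecked dereference above then panics.
	_, err := buggySubmit("://invalid")
	fmt.Println(err)
}
```

In the real exporter the panic is not recovered, which is why the whole collector process halts.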

Additional context

No response

@julealgon julealgon added the bug (Something isn't working) and needs triage (New item requiring triage) labels Nov 4, 2022
@github-actions github-actions bot added the exporter/datadog Datadog components label Nov 4, 2022
@github-actions
Contributor

github-actions bot commented Nov 4, 2022

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.

@mx-psi mx-psi added the data:logs (Logs related issues) and priority:p1 (High) labels and removed the needs triage (New item requiring triage) label Nov 4, 2022
@mx-psi
Member

mx-psi commented Nov 4, 2022

@dineshg13 I would assume this happens because r is nil here, since the submission returned an error. Could you have a look?

I think we can just get rid of reading the body. It would also be nice to have a unit test for this.
