[OTEL Infra Host Mapping] datadog.host.use_as_metadata causes Panic #306

Closed
atmask opened this issue Mar 22, 2024 · 1 comment · Fixed by #323 or open-telemetry/opentelemetry-collector-contrib#32865

Comments


atmask commented Mar 22, 2024

Problem: When enabling the OTEL Infra Host Mapping functionality, the OTEL Collector is thrown into a panic that originates from this module.

Config:

processors:
  transform/datadog:
    metric_statements: &statements
      - context: resource
        statements:
          - set(attributes["datadog.host.use_as_metadata"], true)
          - set(attributes["datadog.host.name"], <name>)
    log_statements: *statements
    trace_statements: *statements

exporters:
  datadog:
    hostname: <also try setting here>
    api: <omitting>

Full Trace

panic: assignment to entry in nil map

goroutine 181 [running]:
github.com/DataDog/opentelemetry-mapping-go/pkg/inframetadata/internal/hostmap.(*HostMap).Update(0xc002e96e20, {0xc00252c570, 0x24}, {0xc003ff14b8?, 0xc003fa163c?})
	github.com/DataDog/opentelemetry-mapping-go/pkg/inframetadata@v0.13.4/internal/hostmap/hostmap.go:243 +0x619
github.com/DataDog/opentelemetry-mapping-go/pkg/inframetadata.(*Reporter).ConsumeResource(0xc002eb5290, {0xc003ff14b8?, 0xc003fa163c?})
	github.com/DataDog/opentelemetry-mapping-go/pkg/inframetadata@v0.13.4/reporter.go:137 +0x125
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.consumeResource(0xc003b1d870?, {0xc003ff14b8, 0xc003fa163c}, 0x407985?)
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.96.0/factory.go:73 +0x37
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.(*metricsExporter).PushMetricsData(0xc002e8d560, {0x9216420, 0xc0022a32c0}, {0xc003fb3a10?, 0xc003fa163c?})
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.96.0/metrics_exporter.go:203 +0x12d3
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.(*metricsExporter).PushMetricsDataScrubbed(0xc002e8d560, {0x9216420?, 0xc0022a32c0?}, {0xc003fb3a10?, 0xc003fa163c?})
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.96.0/metrics_exporter.go:185 +0x2c
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsRequest).Export(0x0?, {0x9216420?, 0xc0022a32c0?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/metrics.go:59 +0x31
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send(0xc00274ec78?, {0x9216420?, 0xc0022a32c0?}, {0x91cd7f0?, 0xc002e7cca8?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/timeout_sender.go:43 +0x48
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send(0xc002e86380?, {0x9216420?, 0xc0022a32c0?}, {0x91cd7f0?, 0xc002e7cca8?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/common.go:35 +0x30
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send(0xc002eb5da0, {0x92168d0?, 0xc001d7df50?}, {0x91cd7f0?, 0xc002e7cca8?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/metrics.go:155 +0x7e
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1({0x92168d0?, 0xc001d7df50?}, {0x91cd7f0?, 0xc002e7cca8?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/queue_sender.go:95 +0x84
go.opentelemetry.io/collector/exporter/internal/queue.(*boundedMemoryQueue[...]).Consume(0x9221bc0, 0xc002eb5e00)
	go.opentelemetry.io/collector/exporter@v0.96.0/internal/queue/bounded_memory_queue.go:57 +0xc7
go.opentelemetry.io/collector/exporter/internal/queue.(*Consumers[...]).Start.func1()
	go.opentelemetry.io/collector/exporter@v0.96.0/internal/queue/consumers.go:43 +0x79
created by go.opentelemetry.io/collector/exporter/internal/queue.(*Consumers[...]).Start in goroutine 1
	go.opentelemetry.io/collector/exporter@v0.96.0/internal/queue/consumers.go:39 +0x7d
panic: assignment to entry in nil map

goroutine 124 [running]:
github.com/DataDog/opentelemetry-mapping-go/pkg/inframetadata/internal/hostmap.(*HostMap).Update(0xc002e96e20, {0xc00252c570, 0x24}, {0xc003fb59f8?, 0xc002253790?})
	github.com/DataDog/opentelemetry-mapping-go/pkg/inframetadata@v0.13.4/internal/hostmap/hostmap.go:243 +0x619
github.com/DataDog/opentelemetry-mapping-go/pkg/inframetadata.(*Reporter).ConsumeResource(0xc002eb5290, {0xc003fb59f8?, 0xc002253790?})
	github.com/DataDog/opentelemetry-mapping-go/pkg/inframetadata@v0.13.4/reporter.go:137 +0x125
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.consumeResource(0xc0034677f8?, {0xc003fb59f8, 0xc002253790}, 0xc0034679b8?)
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.96.0/factory.go:73 +0x37
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.(*metricsExporter).PushMetricsData(0xc002e8d560, {0x9216420, 0xc002cf2f60}, {0xc002e7cb70?, 0xc002253790?})
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.96.0/metrics_exporter.go:203 +0x12d3
github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter.(*metricsExporter).PushMetricsDataScrubbed(0xc002e8d560, {0x9216420?, 0xc002cf2f60?}, {0xc002e7cb70?, 0xc002253790?})
	github.com/open-telemetry/opentelemetry-collector-contrib/exporter/datadogexporter@v0.96.0/metrics_exporter.go:185 +0x2c
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsRequest).Export(0x0?, {0x9216420?, 0xc002cf2f60?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/metrics.go:59 +0x31
go.opentelemetry.io/collector/exporter/exporterhelper.(*timeoutSender).send(0xc002a76390?, {0x9216420?, 0xc002cf2f60?}, {0x91cd7f0?, 0xc002e7cc00?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/timeout_sender.go:43 +0x48
go.opentelemetry.io/collector/exporter/exporterhelper.(*baseRequestSender).send(0xc002e86380?, {0x9216420?, 0xc002cf2f60?}, {0x91cd7f0?, 0xc002e7cc00?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/common.go:35 +0x30
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send(0xc002eb5da0, {0x92168d0?, 0xc001d7dd00?}, {0x91cd7f0?, 0xc002e7cc00?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/metrics.go:155 +0x7e
go.opentelemetry.io/collector/exporter/exporterhelper.newQueueSender.func1({0x92168d0?, 0xc001d7dd00?}, {0x91cd7f0?, 0xc002e7cc00?})
	go.opentelemetry.io/collector/exporter@v0.96.0/exporterhelper/queue_sender.go:95 +0x84
go.opentelemetry.io/collector/exporter/internal/queue.(*boundedMemoryQueue[...]).Consume(0x9221bc0, 0xc002eb5e00)
	go.opentelemetry.io/collector/exporter@v0.96.0/internal/queue/bounded_memory_queue.go:57 +0xc7
go.opentelemetry.io/collector/exporter/internal/queue.(*Consumers[...]).Start.func1()
	go.opentelemetry.io/collector/exporter@v0.96.0/internal/queue/consumers.go:43 +0x79
created by go.opentelemetry.io/collector/exporter/internal/queue.(*Consumers[...]).Start in goroutine 1
	go.opentelemetry.io/collector/exporter@v0.96.0/internal/queue/consumers.go:39 +0x7d

Investigation: It looks like this originates from the ConsumeResource call on the Reporter struct. The Reporter appears to have an uninitialized HostMap, so the map write in HostMap.Update (hostmap.go:243) panics with "assignment to entry in nil map".
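
For illustration, here is a minimal Go sketch of the panic class reported in the trace ("assignment to entry in nil map") and of the usual fix, allocating the map in a constructor. The hostMap type, field names, and functions below are hypothetical stand-ins, not the actual inframetadata HostMap code.

```go
package main

// hostMap is a hypothetical stand-in for a struct that caches per-host
// metadata; it is NOT the real inframetadata HostMap.
type hostMap struct {
	hosts map[string]string // hostname -> serialized metadata payload
}

// update writes an entry. If hosts was never allocated with make(), the
// assignment below panics with "assignment to entry in nil map".
func (h *hostMap) update(host, payload string) {
	h.hosts[host] = payload
}

// newHostMap shows the usual fix: allocate the map in a constructor so
// every write path sees a non-nil map.
func newHostMap() *hostMap {
	return &hostMap{hosts: make(map[string]string)}
}

func main() {
	bad := &hostMap{} // zero value: hosts == nil
	_ = bad           // bad.update("web-1", "{}") would panic here

	good := newHostMap()       // map initialized up front
	good.update("web-1", "{}") // safe write
}
```

If the Reporter's host map is ever left at its zero value, any resource carrying datadog.host.use_as_metadata would hit that write path on the first ConsumeResource call, which matches the trace above.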

mx-psi commented Apr 10, 2024

@atmask Apologies for the delay in replying here; we don't monitor the issues on this repository very frequently. Would you mind filing a support ticket with Datadog (see https://www.datadoghq.com/support/)? You can mention this issue and ping me directly.

mx-psi added a commit to open-telemetry/opentelemetry-collector-contrib that referenced this issue May 6, 2024
Update opentelemetry-mapping-go to v0.16.0

**Link to tracking Issue:** Fixes DataDog/opentelemetry-mapping-go/issues/306