
Enable Jaeger Service Performance Management (SPM) experimental feature #20

Open · wants to merge 3 commits into main
Conversation

ramonguiu (Contributor)

Configure Jaeger and the OTel Collector to enable Jaeger SPM experimental feature
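
For context, wiring up SPM generally means adding the spanmetrics processor to the collector's traces pipeline, exposing the generated metrics to Prometheus, and pointing jaeger-query at that Prometheus server (METRICS_STORAGE_TYPE=prometheus and PROMETHEUS_SERVER_URL). A minimal collector sketch follows; it is not the exact configuration in this PR, and the exporter names, ports, and hostnames are assumptions:

    receivers:
      otlp:
        protocols:
          grpc:

    processors:
      batch:
      spanmetrics:
        # The spanmetrics processor forwards the metrics it generates to the
        # exporter named here; that exporter must also be part of a metrics pipeline.
        metrics_exporter: prometheus

    exporters:
      jaeger:
        endpoint: jaeger:14250        # assumed Jaeger collector gRPC endpoint
        tls:
          insecure: true
      prometheus:
        endpoint: "0.0.0.0:8889"      # scraped by Prometheus, which jaeger-query reads from

    service:
      pipelines:
        traces:
          receivers: [otlp]
          processors: [spanmetrics, batch]
          exporters: [jaeger]
        metrics:
          receivers: [otlp]
          exporters: [prometheus]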

ramonguiu requested a review from jgpruitt as a code owner on June 25, 2022.
ramonguiu (Contributor, Author) commented Jun 25, 2022

I am not 100% sure we should merge this. The spanmetrics processor crashes from time to time and the OTel Collector needs to restart. I am not sure what is causing it.

panic: runtime error: index out of range [17] with length 17 [recovered]
	panic: runtime error: index out of range [17] with length 17

goroutine 172 [running]:
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End.func1()
	go.opentelemetry.io/otel/sdk@v1.7.0/trace/span.go:359 +0x2a
go.opentelemetry.io/otel/sdk/trace.(*recordingSpan).End(0xc001d56180, {0x0, 0x0, 0x403be6?})
	go.opentelemetry.io/otel/sdk@v1.7.0/trace/span.go:398 +0x8dd
panic({0x483d560, 0xc001d7c390})
	runtime/panic.go:838 +0x207
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).updateLatencyMetrics(...)
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor@v0.54.0/processor.go:442
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).aggregateMetricsForSpan(0xc000574780, {0xc001c169bb, 0x5}, {0x0?}, {0x8?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor@v0.54.0/processor.go:395 +0x466
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).aggregateMetricsForServiceSpans(0x0?, {0x4bde08a?}, {0xc001c169bb, 0x5})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor@v0.54.0/processor.go:379 +0x7d
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).aggregateMetrics(0x4?, {0x0?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor@v0.54.0/processor.go:368 +0xd1
github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor.(*processorImp).ConsumeTraces(0xc000574780, {0x54e0000, 0xc001c13aa0}, {0x0?})
	github.com/open-telemetry/opentelemetry-collector-contrib/processor/spanmetricsprocessor@v0.54.0/processor.go:233 +0x34
go.opentelemetry.io/collector/receiver/otlpreceiver/internal/trace.(*Receiver).Export(0xc000b00150, {0x54e0000, 0xc001c13a10}, {0xc000181c80?})
	go.opentelemetry.io/collector@v0.54.0/receiver/otlpreceiver/internal/trace/otlp.go:60 +0xd3
go.opentelemetry.io/collector/pdata/ptrace/ptraceotlp.rawTracesServer.Export({{0x54af3e0?, 0xc000b00150?}}, {0x54e0000, 0xc001c13a10}, 0xc001c50840)
	go.opentelemetry.io/collector/pdata@v0.54.0/ptrace/ptraceotlp/traces.go:167 +0xff
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler.func1({0x54e0000, 0xc001c13a10}, {0x4991820?, 0xc001c50840})
	go.opentelemetry.io/collector/pdata@v0.54.0/internal/data/protogen/collector/trace/v1/trace_service.pb.go:216 +0x78
go.opentelemetry.io/collector/config/configgrpc.enhanceWithClientInformation.func1({0x54e0000?, 0xc001c139b0?}, {0x4991820, 0xc001c50840}, 0x0?, 0xc001c50858)
	go.opentelemetry.io/collector@v0.54.0/config/configgrpc/configgrpc.go:385 +0x4c
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x54e0000?, 0xc001c139b0?}, {0x4991820?, 0xc001c50840?})
	google.golang.org/grpc@v1.47.0/server.go:1117 +0x5b
go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc.UnaryServerInterceptor.func1({0x54e0000, 0xc001c138c0}, {0x4991820, 0xc001c50840}, 0xc001d54000, 0xc001d3f400)
	go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc@v0.32.0/interceptor.go:325 +0x664
google.golang.org/grpc.chainUnaryInterceptors.func1.1({0x54e0000?, 0xc001c138c0?}, {0x4991820?, 0xc001c50840?})
	google.golang.org/grpc@v1.47.0/server.go:1120 +0x83
google.golang.org/grpc.chainUnaryInterceptors.func1({0x54e0000, 0xc001c138c0}, {0x4991820, 0xc001c50840}, 0xc001d54000, 0xc001c50858)
	google.golang.org/grpc@v1.47.0/server.go:1122 +0x12b
go.opentelemetry.io/collector/pdata/internal/data/protogen/collector/trace/v1._TraceService_Export_Handler({0x41e2900?, 0xc00051b200}, {0x54e0000, 0xc001c138c0}, 0xc000323200, 0xc000691bc0)
	go.opentelemetry.io/collector/pdata@v0.54.0/internal/data/protogen/collector/trace/v1/trace_service.pb.go:218 +0x138
google.golang.org/grpc.(*Server).processUnaryRPC(0xc0006d7340, {0x54f4bb8, 0xc000ac8b60}, 0xc00076de60, 0xc000833ec0, 0x7dd94d0, 0x0)
	google.golang.org/grpc@v1.47.0/server.go:1283 +0xcfd
google.golang.org/grpc.(*Server).handleStream(0xc0006d7340, {0x54f4bb8, 0xc000ac8b60}, 0xc00076de60, 0x0)
	google.golang.org/grpc@v1.47.0/server.go:1620 +0xa1b
google.golang.org/grpc.(*Server).serveStreams.func1.2()
	google.golang.org/grpc@v1.47.0/server.go:922 +0x98
created by google.golang.org/grpc.(*Server).serveStreams.func1
	google.golang.org/grpc@v1.47.0/server.go:920 +0x28a
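
The panic surfaces in the processor's updateLatencyMetrics path, which maps a span's latency onto a histogram-bucket index. For reference only, the processor does let you set those buckets explicitly; this is a sketch of the option, not a confirmed workaround for the crash:

    processors:
      spanmetrics:
        metrics_exporter: prometheus
        # Explicit latency histogram buckets (otherwise the processor uses its defaults).
        latency_histogram_buckets: [2ms, 6ms, 10ms, 100ms, 250ms, 1s]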

janssenlima commented

I updated the Promscale version to 0.14.0 and didn't have this problem.
janssenlima left a comment

Please update the Promscale version to 0.14.0.
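
If the stack is defined with Docker Compose, that bump would be a one-line image tag change along these lines (hypothetical excerpt; the actual file and service name in this repo may differ):

    services:
      promscale:
        image: timescale/promscale:0.14.0   # previously an older 0.x tag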

jgpruitt removed their request for review on February 6, 2023.