
Thanos Receive Ingester: In memory representation of metrics distorted during instability. #6265

Closed
vanugrah opened this issue Apr 8, 2023 · 6 comments


@vanugrah
Contributor

vanugrah commented Apr 8, 2023

Thanos, Prometheus and Golang version used:
Thanos: 0.30.2 (using go1.19.5)

Object Storage Provider:
Minio

What happened:
Last weekend we had an incident with our Thanos Receive cluster that led to incorrect query results during and after the incident. We deploy Thanos Receive in the dual router/ingester mode and experienced instability on the ingesters, causing high error rates on remote write requests. We're still investigating what caused the instability (the number of scheduled/running goroutines went above 100k and stayed at that level), but the effect we noticed is that query results for a particular metric included other metrics altogether. For example:

[Screenshot: Screen Shot 2023-04-04 at 11 25 57 AM]

I ran the below query at different queriers across our topology and narrowed the issue down specifically to the ingesters:
[Screenshot: Screen Shot 2023-04-04 at 11 51 01 AM]

Once the ingester stabilized, this behavior persisted, i.e. querying for a particular metric would return other metrics in the result. This behavior was only observed on one tenant, only occurred within the time range served by the ingesters (4h), and was not present in the actual TSDB blocks generated and uploaded to S3. Ultimately we had to prune the affected tenant (so a fresh TSDB instance would be created) in order to mitigate the issue.

What you expected to happen:
Metrics shouldn't have become intermingled, since this breaks PromQL guarantees. (We were using the standard Prometheus PromQL engine.)

How to reproduce it (as minimally and precisely as possible):
This is a tough one. The short answer is we don't know. We're still investigating the conditions that caused this behavior, and I'll be sure to share more details as they arise. I still felt it was important to share this observation, since in the six years I've been working in this field I have never seen anything quite like this.

@vanugrah
Contributor Author

vanugrah commented Apr 13, 2023

Quick update here: this is likely related to a combination of Thanos Receive crashing due to runtime panics and memory snapshotting being enabled on the Prometheus TSDB. A similar issue was raised here:
prometheus/prometheus#9725

The above issue was addressed in Prometheus 2.32.0, which corresponds to Prometheus Go module version 0.32.0, and we're running Thanos 0.30.2, which uses Prometheus module version 0.40.7. So it could be that there are further bugs with memory snapshotting.
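For context on the snapshotting mechanism: with memory snapshotting enabled, a clean shutdown dumps the head block's in-memory series and chunks to a chunk_snapshot directory, and the next startup replays that snapshot instead of the full WAL. Below is a minimal, illustrative sketch of the TSDB-level option involved; this is my assumption about how the feature is toggled at the library level, not the actual wiring inside Thanos Receive:

    package main

    import (
        "log"

        "github.com/prometheus/prometheus/tsdb"
    )

    func main() {
        opts := tsdb.DefaultOptions()
        // Illustrative: with this enabled, a clean shutdown writes the head to a
        // chunk_snapshot directory and the next startup replays it instead of the WAL.
        opts.EnableMemorySnapshotOnShutdown = true

        db, err := tsdb.Open("data", nil, nil, opts, nil)
        if err != nil {
            log.Fatal(err)
        }
        defer db.Close()
    }

If the head was already distorted when a snapshot was written, replaying that snapshot would carry the distortion forward across restarts, which would match the behavior we saw.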

From the logs we can see there were runtime panics that were recovered:

level=error ts=2023-04-12T01:01:26.124608961Z caller=grpc.go:70 component=receive service=gRPC/server component=receive msg="recovered from panic" panic="runtime error: invalid memory address or nil pointer dereference" stack="goroutine 8891 [running]:\nruntime/debug.Stack()\n\t/usr/local/go/src/runtime/debug/stack.go:24 +0x65\ngithub.com/thanos-io/thanos/pkg/server/grpc.New.func1({0x22909c0?, 0x4109510})\n\t/app/pkg/server/grpc/grpc.go:70 +0xda\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors/recovery.WithRecoveryHandler.func1.1({0x40d95f?, 0x7f373fc72b38?}, {0x22909c0?, 0x4109510?})\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/recovery/options.go:33 +0x2d\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors/recovery.recoverFrom({0x2bf41e8?, 0xc2397f76e0?}, {0x22909c0?, 0x4109510?}, 0xc23fc2cf40?)\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/recovery/interceptors.go:53 +0x36\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors/recovery.UnaryServerInterceptor.func1.1()\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/recovery/interceptors.go:27 +0x68\npanic({0x22909c0, 0x4109510})\n\t/usr/local/go/src/runtime/panic.go:884 +0x212\ngithub.com/thanos-io/thanos/pkg/store.(*TSDBStore).TimeRange(0xc23fc2d0e8?)\n\t/app/pkg/store/tsdb.go:117 +0x14\ngithub.com/thanos-io/thanos/pkg/receive.(*localClient).TimeRange(0x0?)\n\t/app/pkg/receive/multitsdb.go:113 +0x1b\ngithub.com/thanos-io/thanos/pkg/store.(*ProxyStore).Info(0xc001188ea0, {0x232f1a0?, 0x2504820?}, 0x8000000000000000?)\n\t/app/pkg/store/proxy.go:152 +0x178\ngithub.com/thanos-io/thanos/pkg/store/storepb._Store_Info_Handler.func1({0x2bf41e8, 0xc2397f7a40}, {0x257be60?, 0x417d880})\n\t/app/pkg/store/storepb/rpc.pb.go:1071 +0x78\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors.UnaryServerInterceptor.func1({0x2bf41e8, 0xc2397f7a40}, {0x257be60, 0x417d880}, 0x6b9e1a346?, 0xc23fc1ff38)\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/server.go:22 +0x21e\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2bf41e8?, 0xc2397f7a40?}, {0x257be60?, 0x417d880?})\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors.UnaryServerInterceptor.func1({0x2bf41e8, 0xc2397f7860}, {0x257be60, 0x417d880}, 0x22fe7c0?, 0xc2341e32c0)\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/server.go:22 +0x21e\ngithub.com/thanos-io/thanos/pkg/tracing.UnaryServerInterceptor.func1({0x2bf41e8?, 0xc2397f77a0?}, {0x257be60, 0x417d880}, 0x0?, 0xffffffffffffffff?)\n\t/app/pkg/tracing/grpc.go:30 +0x88\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2bf41e8?, 0xc2397f77a0?}, {0x257be60?, 0x417d880?})\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors.UnaryServerInterceptor.func1({0x2bf41e8, 0xc2397f76e0}, {0x257be60, 0x417d880}, 0x0?, 0xc2341e32e0)\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/server.go:22 
+0x21e\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2bf41e8?, 0xc2397f76e0?}, {0x257be60?, 0x417d880?})\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a\ngithub.com/grpc-ecosystem/go-grpc-prometheus.(*ServerMetrics).UnaryServerInterceptor.func1({0x2bf41e8, 0xc2397f76e0}, {0x257be60, 0x417d880}, 0x7f360c02ce18?, 0xc2341e3300)\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-prometheus@v1.2.0/server_metrics.go:107 +0x87\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2bf41e8?, 0xc2397f76e0?}, {0x257be60?, 0x417d880?})\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors/recovery.UnaryServerInterceptor.func1({0x2bf41e8?, 0xc2397f76e0?}, {0x257be60?, 0x417d880?}, 0x7f35b0b327f8?, 0xc2341e32a0?)\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/recovery/interceptors.go:31 +0xa7\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2bf41e8?, 0xc2397f76e0?}, {0x257be60?, 0x417d880?})\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a\ngithub.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1({0x2bf41e8, 0xc2397f76e0}, {0x257be60, 0x417d880}, 0x228f400?, 0x26056e0?)\n\t/go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:36 +0xbe\ngithub.com/thanos-io/thanos/pkg/store/storepb._Store_Info_Handler({0x2504820?, 0xc00118e7e0}, {0x2bf41e8, 0xc2397f76e0}, 0x26056e0?, 0xc001194fc0)\n\t/app/pkg/store/storepb/rpc.pb.go:1073 +0x126\ngoogle.golang.org/grpc.(*Server).processUnaryRPC(0xc000269dc0, {0x2c06b00, 0xc07d2e4000}, 0xc0199166c0, 0xc0011951a0, 0x411a120, 0x0)\n\t/go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:1282 +0xccf\ngoogle.golang.org/grpc.(*Server).handleStream(0xc000269dc0, {0x2c06b00, 0xc07d2e4000}, 0xc0199166c0, 0x0)\n\t/go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:1619 +0xa2f\ngoogle.golang.org/grpc.(*Server).serveStreams.func1.2()\n\t/go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:921 +0x98\ncreated by google.golang.org/grpc.(*Server).serveStreams.func1\n\t/go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:919 +0x28a\n" 

However, looking at the Kubernetes pod state, we see that the pod exited with panics as well:

    Last State:  Terminated
      Reason:    Error
      Message:   ver).processStreamingRPC(0xc000f70000, {0x2c06b00, 0xc03d9a7860}, 0xc2bc0f70e0, 0xc000f5d0b0, 0x4110de0, 0x0)
                 /go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:1548 +0xf1b fp=0xc3164ffe48 sp=0xc3164ffb90 pc=0xecfebb
google.golang.org/grpc.(*Server).handleStream(0xc000f70000, {0x2c06b00, 0xc03d9a7860}, 0xc2bc0f70e0, 0x0)
  /go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:1623 +0x9ea fp=0xc3164fff68 sp=0xc3164ffe48 pc=0xed16ca
google.golang.org/grpc.(*Server).serveStreams.func1.2()
  /go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:921 +0x98 fp=0xc3164fffe0 sp=0xc3164fff68 pc=0xecac38
runtime.goexit()
  /usr/local/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc3164fffe8 sp=0xc3164fffe0 pc=0x46e341
created by google.golang.org/grpc.(*Server).serveStreams.func1
  /go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:919 +0x28a

goroutine 471731776 [select]:
runtime.gopark(0xc05d163cf0?, 0x3?, 0x0?, 0x0?, 0xc05d163cd2?)
  /usr/local/go/src/runtime/proc.go:363 +0xd6 fp=0xc05d163b58 sp=0xc05d163b38 pc=0x43cd56
runtime.selectgo(0xc05d163cf0, 0xc05d163ccc, 0x0?, 0x0, 0x0?, 0x1)
  /usr/local/go/src/runtime/select.go:328 +0x7bc fp=0xc05d163c98 sp=0xc05d163b58 pc=0x44cc3c
github.com/thanos-io/thanos/pkg/store/storepb.(*inProcessClientStream).Recv(0xc0a15575f0)
  /app/pkg/store/storepb/inprocess.go:83 +0xbd fp=0xc05d163d30 sp=0xc05d163c98 pc=0x143b03d
github.com/thanos-io/thanos/pkg/store.newLazyRespSet.func1.3(0x0)
  /app/pkg/store/proxy_heap.go:424 +0x3f9 fp=0xc05d163e90 sp=0xc05d163d30 pc=0x1991879
github.com/thanos-io/thanos/pkg/store.newLazyRespSet.func1({0xc24ee9b4a0?, 0x0?}, 0x0?)
  /app/pkg/store/proxy_heap.go:479 +0x229 fp=0xc05d163fb8 sp=0xc05d163e90 pc=0x1991369
github.com/thanos-io/thanos/pkg/store.newLazyRespSet.func2()
  /app/pkg/store/proxy_heap.go:483 +0x32 fp=0xc05d163fe0 sp=0xc05d163fb8 pc=0x1991112
runtime.goexit()
  /usr/local/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc05d163fe8 sp=0xc05d163fe0 pc=0x46e341
created by github.com/thanos-io/thanos/pkg/store.newLazyRespSet
  /app/pkg/store/proxy_heap.go:388 +0x3ca

      Exit Code:    2

So my suspicion is that there are two distinct issues: one that causes an unrecoverable panic on Thanos Receive, and a secondary issue that causes corruption of the Prometheus index when memory snapshotting is enabled and the TSDB crashes.
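To make the distinction between the two failure modes concrete: the first trace shows a panic caught by the gRPC recovery interceptor, which converts it into an error and keeps the process (and whatever in-memory state it had) running, whereas the pod termination above (Exit Code 2) is an unrecovered fatal error that kills the process outright. A minimal, self-contained sketch of the recovery pattern, illustrative only and not the actual Thanos interceptor:

    package main

    import "fmt"

    // callSafely mimics what the recovery interceptor in the first trace does:
    // it converts a panic inside the handler into an ordinary error so the
    // server keeps serving. A "fatal error: fault" (unexpected fault address),
    // by contrast, bypasses recover() entirely and always terminates the process.
    func callSafely(handler func() error) (err error) {
        defer func() {
            if r := recover(); r != nil {
                err = fmt.Errorf("recovered from panic: %v", r)
            }
        }()
        return handler()
    }

    func main() {
        err := callSafely(func() error {
            var ts *struct{ minTime int64 }
            _ = ts.minTime // nil pointer dereference, similar to the TimeRange panic above
            return nil
        })
        fmt.Println(err) // recovered from panic: runtime error: invalid memory address or nil pointer dereference
    }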

@fpetkovski
Contributor

fpetkovski commented Apr 13, 2023

Thanks for coming back with more information. The known panics should already be fixed in #6203 and #6271. The first one is already in 0.31, so updating might help you avoid similar issues in the future. We also updated the Prometheus version in the latest release, so the snapshotting issue might also be fixed.

@vanugrah
Contributor Author

vanugrah commented Apr 13, 2023 via email

@vanugrah
Contributor Author

Hey folks, after upgrading, the receive ingesters have been significantly more stable. There was still one panic, though, that caused a crash:

unexpected fault address 0x7f0318c66109
fatal error: fault
[signal SIGSEGV: segmentation violation code=0x1 addr=0x7f0318c66109 pc=0x402ddb]

goroutine 7501757670 [running]:
runtime.throw({0x25e51ec?, 0xc2dfa1f100?})
        /usr/local/go/src/runtime/panic.go:1047 +0x5d fp=0xc2dfa1f080 sp=0xc2dfa1f050 pc=0x439edd
runtime.sigpanic()
        /usr/local/go/src/runtime/signal_unix.go:842 +0x2c5 fp=0xc2dfa1f0d0 sp=0xc2dfa1f080 pc=0x450745
cmpbody()
        /usr/local/go/src/internal/bytealg/compare_amd64.s:53 +0x3b fp=0xc2dfa1f0d8 sp=0xc2dfa1f0d0 pc=0x402ddb
sort.StringSlice.Less(...)
        /usr/local/go/src/sort/sort.go:148
sort.(*StringSlice).Less(0x40b6ca?, 0x18?, 0x21455a0?)
        <autogenerated>:1 +0x65 fp=0xc2dfa1f108 sp=0xc2dfa1f0d8 pc=0x4d6f25
sort.IsSorted({0x2b59700, 0xc1bdebe498})
        /usr/local/go/src/sort/sort.go:102 +0x56 fp=0xc2dfa1f138 sp=0xc2dfa1f108 pc=0x4d2f56
sort.StringsAreSorted(...)
        /usr/local/go/src/sort/sort.go:174
github.com/thanos-io/thanos/pkg/strutil.MergeUnsortedSlices({0xc02d24de00, 0xe, 0x10})
        /app/pkg/strutil/merge.go:28 +0x8b fp=0xc2dfa1f188 sp=0xc2dfa1f138 pc=0x192588b
github.com/thanos-io/thanos/pkg/store.(*ProxyStore).LabelValues(0xc000879c20, {0x2b5a648, 0xc2ecb7ee70}, 0xc2e55fdbd0)
        /app/pkg/store/proxy.go:530 +0x325 fp=0xc2dfa1f320 sp=0xc2dfa1f188 pc=0x1a051e5
github.com/thanos-io/thanos/pkg/store.(*instrumentedStoreServer).LabelValues(0x19?, {0x2b5a648?, 0xc2ecb7ee70?}, 0x40a86d?)
        <autogenerated>:1 +0x34 fp=0xc2dfa1f350 sp=0xc2dfa1f320 pc=0x1a12b54
github.com/thanos-io/thanos/pkg/store.(*limitedStoreServer).LabelValues(0xc2f3082000?, {0x2b5a648?, 0xc2ecb7ee70?}, 0x25e5ea7?)
        <autogenerated>:1 +0x34 fp=0xc2dfa1f380 sp=0xc2dfa1f350 pc=0x1a11794
github.com/thanos-io/thanos/pkg/store.(*ReadWriteTSDBStore).LabelValues(0x40b965?, {0x2b5a648?, 0xc2ecb7ee70?}, 0x8000000000000000?)
        <autogenerated>:1 +0x34 fp=0xc2dfa1f3b0 sp=0xc2dfa1f380 pc=0x1a138b4
github.com/thanos-io/thanos/pkg/store.(*recoverableStoreServer).LabelValues(0x1be3?, {0x2b5a648?, 0xc2ecb7ee70?}, 0xc2e6b6c420?)
        <autogenerated>:1 +0x35 fp=0xc2dfa1f3e0 sp=0xc2dfa1f3b0 pc=0x1a12695
github.com/thanos-io/thanos/pkg/store/storepb._Store_LabelValues_Handler.func1({0x2b5a648, 0xc2ecb7ee70}, {0x24de040?, 0xc2e55fdbd0})
        /app/pkg/store/storepb/rpc.pb.go:1147 +0x78 fp=0xc2dfa1f420 sp=0xc2dfa1f3e0 pc=0x14a8618
github.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors.UnaryServerInterceptor.func1({0x2b5a648, 0xc2ecb7ee70}, {0x24de040, 0xc2e55fdbd0}, 0x15cc49fcedf1c?, 0xc2e6b2e2b8)
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/server.go:22 +0x21e fp=0xc2dfa1f560 sp=0xc2dfa1f420 pc=0x13c2ede
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2b5a648?, 0xc2ecb7ee70?}, {0x24de040?, 0xc2e55fdbd0?})
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a fp=0xc2dfa1f5a0 sp=0xc2dfa1f560 pc=0x14cf6ba
github.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors.UnaryServerInterceptor.func1({0x2b5a648, 0xc2ecb7e930}, {0x24de040, 0xc2e55fdbd0}, 0x225b2a0?, 0xc2dff4bd20)
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/server.go:22 +0x21e fp=0xc2dfa1f6e0 sp=0xc2dfa1f5a0 pc=0x13c2ede
github.com/thanos-io/thanos/pkg/tracing.UnaryServerInterceptor.func1({0x2b5a648?, 0xc2ecb7e810?}, {0x24de040, 0xc2e55fdbd0}, 0x0?, 0xffffffffffffffff?)
        /app/pkg/tracing/grpc.go:30 +0x88 fp=0xc2dfa1f738 sp=0xc2dfa1f6e0 pc=0x150d788
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2b5a648?, 0xc2ecb7e810?}, {0x24de040?, 0xc2e55fdbd0?})
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a fp=0xc2dfa1f778 sp=0xc2dfa1f738 pc=0x14cf6ba
github.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors.UnaryServerInterceptor.func1({0x2b5a648, 0xc2ecb7e720}, {0x24de040, 0xc2e55fdbd0}, 0x0?, 0xc2dff4bf20)
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/server.go:22 +0x21e fp=0xc2dfa1f8b8 sp=0xc2dfa1f778 pc=0x13c2ede
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2b5a648?, 0xc2ecb7e720?}, {0x24de040?, 0xc2e55fdbd0?})
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a fp=0xc2dfa1f8f8 sp=0xc2dfa1f8b8 pc=0x14cf6ba
github.com/grpc-ecosystem/go-grpc-prometheus.(*ServerMetrics).UnaryServerInterceptor.func1({0x2b5a648, 0xc2ecb7e720}, {0x24de040, 0xc2e55fdbd0}, 0x41c206?, 0xc2dff4bf40)
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-prometheus@v1.2.0/server_metrics.go:107 +0x87 fp=0xc2dfa1f958 sp=0xc2dfa1f8f8 pc=0x1c878a7
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2b5a648?, 0xc2ecb7e720?}, {0x24de040?, 0xc2e55fdbd0?})
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a fp=0xc2dfa1f998 sp=0xc2dfa1f958 pc=0x14cf6ba
github.com/grpc-ecosystem/go-grpc-middleware/v2/interceptors/recovery.UnaryServerInterceptor.func1({0x2b5a648?, 0xc2ecb7e720?}, {0x24de040?, 0xc2e55fdbd0?}, 0x7f027d51c9f8?, 0xc2dff4bac0?)
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/interceptors/recovery/interceptors.go:31 +0xa7 fp=0xc2dfa1fa20 sp=0xc2dfa1f998 pc=0x1c8c5e7
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1.1.1({0x2b5a648?, 0xc2ecb7e720?}, {0x24de040?, 0xc2e55fdbd0?})
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:27 +0x3a fp=0xc2dfa1fa60 sp=0xc2dfa1fa20 pc=0x14cf6ba
github.com/grpc-ecosystem/go-grpc-middleware/v2.ChainUnaryServer.func1({0x2b5a648, 0xc2ecb7e720}, {0x24de040, 0xc2e55fdbd0}, 0xc27643fb00?, 0x21eb180?)
        /go/pkg/mod/github.com/grpc-ecosystem/go-grpc-middleware/v2@v2.0.0-rc.2.0.20201207153454-9f6bf00c00a7/chain.go:36 +0xbe fp=0xc2dfa1fab8 sp=0xc2dfa1fa60 pc=0x14cf55e
github.com/thanos-io/thanos/pkg/store/storepb._Store_LabelValues_Handler({0x2304200?, 0xc00069a160}, {0x2b5a648, 0xc2ecb7e720}, 0xc2a8a65ec0, 0xc000f13e30)
        /app/pkg/store/storepb/rpc.pb.go:1149 +0x138 fp=0xc2dfa1fb10 sp=0xc2dfa1fab8 pc=0x14a84d8
google.golang.org/grpc.(*Server).processUnaryRPC(0xc00095a1c0, {0x2b6cea0, 0xc0d6845520}, 0xc2fd102ea0, 0xc000f1a030, 0x3ff5d10, 0x0)
        /go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:1282 +0xccf fp=0xc2dfa1fe48 sp=0xc2dfa1fb10 pc=0xf9de2f
google.golang.org/grpc.(*Server).handleStream(0xc00095a1c0, {0x2b6cea0, 0xc0d6845520}, 0xc2fd102ea0, 0x0)
        /go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:1619 +0xa2f fp=0xc2dfa1ff68 sp=0xc2dfa1fe48 pc=0xfa246f
google.golang.org/grpc.(*Server).serveStreams.func1.2()
        /go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:921 +0x98 fp=0xc2dfa1ffe0 sp=0xc2dfa1ff68 pc=0xf9b998
runtime.goexit()
        /usr/local/go/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc2dfa1ffe8 sp=0xc2dfa1ffe0 pc=0x46e101
created by google.golang.org/grpc.(*Server).serveStreams.func1
        /go/pkg/mod/google.golang.org/grpc@v1.45.0/server.go:919 +0x28a
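For what it's worth, the fault happens inside cmpbody while sort.StringsAreSorted compares label values in strutil.MergeUnsortedSlices, i.e. the process faulted while reading the bytes backing a string. One plausible way to hit that, purely illustrative and not the actual Thanos code path, is a zero-copy byte-slice-to-string conversion whose backing buffer is later reused or unmapped:

    package main

    import (
        "fmt"
        "unsafe"
    )

    // yoloString reinterprets a byte slice as a string without copying.
    // If buf is later mutated, pooled, or unmapped, the "immutable" string
    // silently changes; in the unmapped case, comparing it can fault, much
    // like the cmpbody crash in the trace above.
    func yoloString(buf []byte) string {
        return *(*string)(unsafe.Pointer(&buf))
    }

    func main() {
        buf := []byte("label_value_a")
        s := yoloString(buf)
        copy(buf, "label_value_b") // mutate the shared backing array
        fmt.Println(s)             // prints "label_value_b", not "label_value_a"
    }

In the benign case this only produces wrong data; if the backing memory comes from an mmapped region that gets unmapped, the same pattern turns into an unexpected fault address and a fatal crash.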

@fpetkovski
Contributor

Looks like #6271, but I don't think this is released in 0.31.

@vanugrah
Contributor Author

I think you might be right
