Metrics exporting issue, unknown field "doubleSum" in v1.Metric #707

Closed

mentos1386 opened this issue Nov 3, 2021 · 5 comments
Labels: metrics (Metrics related issue) · otel-bug (OTEL upstream issues and bugs) · question (Further information is requested)

Comments

@mentos1386

Describe the bug
Unable to configure Node.js to send metrics to the collector, which should then send them on to CloudWatch.

I am receiving the following error:

 items to be sent [
   {
     descriptor: {
       name: 'payload',
       description: 'Metric for counting request payload size',
       unit: '1',
       metricKind: 0,
       valueType: 1
     },
     labels: { pid: '160', env: 'beta' },
     aggregator: SumAggregator { kind: 0, _current: 119, _lastUpdateTime: [Array] },
     aggregationTemporality: 2,
     resource: Resource { attributes: [Object] },
     instrumentationLibrary: { name: 'test', version: undefined }
   },
   {
     descriptor: {
       name: 'activeRequest',
       description: 'Metric for record active requests',
       unit: '1',
       metricKind: 1,
       valueType: 1
     },
     labels: { pid: '160', env: 'beta' },
     aggregator: SumAggregator { kind: 0, _current: -13, _lastUpdateTime: [Array] },
     aggregationTemporality: 2,
     resource: Resource { attributes: [Object] },
     instrumentationLibrary: { name: 'test', version: undefined }
   },
   {
     descriptor: {
       name: 'latency',
       description: 'Metric for record request latency',
       unit: '1',
       metricKind: 2,
       valueType: 1
     },
     labels: { pid: '160', env: 'beta' },
     aggregator: HistogramAggregator {
       kind: 2,
       _boundaries: [Array],
       _current: [Object],
       _lastUpdateTime: [Array]
     },
     aggregationTemporality: 2,
     resource: Resource { attributes: [Object] },
     instrumentationLibrary: { name: 'test', version: undefined }
   }
 ]
 {"stack":"CollectorExporterError: Bad Request\n    at IncomingMessage.<anonymous> (/service/node_modules/@opentelemetry/exporter-collector/src/platform/node/util.ts:69:23)\n    at IncomingMessage.emit (node:events:402:35)\n    at IncomingMessage.emit (node:domain:475:12)\n    at endReadableNT (node:internal/streams/readable:1343:12)\n    at processTicksAndRejections (node:internal/process/task_queues:83:21)","message":"Bad Request","name":"CollectorExporterError","data":"{\"code\":3,\"message\":\"unknown field \\\"doubleSum\\\" in v1.Metric\"}","code":"400"}
 metrics
 items to be sent []
 statusCode: 200 {}

Interestingly, when "items to be sent" is an empty array, the export works. But when there are metrics inside, it fails.

Steps to reproduce

  // Imports inferred for the package versions listed below (not shown in the original snippet):
  import { diag, DiagConsoleLogger, DiagLogLevel } from '@opentelemetry/api';
  import { metrics } from '@opentelemetry/api-metrics';
  import { NodeSDK } from '@opentelemetry/sdk-node';
  import { CollectorTraceExporter, CollectorMetricExporter } from '@opentelemetry/exporter-collector';

  diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);

  const sdk = new NodeSDK({
    traceExporter: new CollectorTraceExporter(),
    metricExporter: new CollectorMetricExporter(),
  });

  await sdk.start();

  const meter = metrics.getMeter('test');

  /** Counter Metrics */
  const payloadMetric = meter.createCounter('payload', {
    description: 'Metric for counting request payload size',
  });

  /** Up and Down Counter Metrics */
  const activeReqMetric = meter.createUpDownCounter('activeRequest', {
    description: 'Metric for record active requests',
  });

  /** Value Recorder Metrics with Histogram */
  const requestLatency = meter.createValueRecorder('latency', {
    description: 'Metric for record request latency',
  });

  /** Define Metrics Dimensions */
  const labels = { pid: process.pid.toString(), env: 'beta' };

  setInterval(() => {
    console.log('metrics');
    payloadMetric.bind(labels).add(1);
    activeReqMetric.bind(labels).add(Math.random() > 0.5 ? 1 : -1);
    requestLatency.bind(labels).record(Math.random() * 1000);
  }, 1000);

With the following versions:

    "@opentelemetry/api": "^1.0.3",
    "@opentelemetry/api-metrics": "^0.25.0",
    "@opentelemetry/core": "^1.0.0",
    "@opentelemetry/exporter-collector": "^0.25.0",
    "@opentelemetry/sdk-node": "^0.26.0"

What did you expect to see?
Metrics in CloudWatch.

What did you see instead?
An error when sending metrics to the exporter (or CloudWatch, I'm not sure which).

Environment
EKS.


@mentos1386 (Author)

Also providing the exporter logs:

[otel] 2021/11/03 14:52:27 AWS OTel Collector version: v0.14.0
[otel] 2021/11/03 14:52:27 find no extra config, skip it, err: open /opt/aws/aws-otel-collector/etc/extracfg.txt: no such file or directory
[otel] 2021-11-03T14:52:27.883Z info    service/collector.go:174        Applying configuration...
[otel] 2021-11-03T14:52:27.885Z info    builder/exporters_builder.go:259        Exporter was built.     {"kind": "exporter", "name": "awsxray"}
[otel] 2021-11-03T14:52:27.885Z info    builder/exporters_builder.go:259        Exporter was built.     {"kind": "exporter", "name": "awsemf"}
[otel] 2021-11-03T14:52:27.889Z info    builder/pipelines_builder.go:220        Pipeline was built.     {"pipeline_name": "traces", "pipeline_datatype": "traces"}
[otel] 2021-11-03T14:52:27.889Z info    builder/pipelines_builder.go:220        Pipeline was built.     {"pipeline_name": "metrics", "pipeline_datatype": "metrics"}
[otel] 2021-11-03T14:52:27.889Z info    builder/receivers_builder.go:228        Receiver was built.     {"kind": "receiver", "name": "otlp", "datatype": "metrics"}
[otel] 2021-11-03T14:52:27.889Z info    builder/receivers_builder.go:228        Receiver was built.     {"kind": "receiver", "name": "otlp", "datatype": "traces"}
[otel] 2021-11-03T14:52:27.890Z info    awsxrayreceiver@v0.38.0/receiver.go:59  Going to listen on endpoint for X-Ray segments  {"kind": "receiver", "name": "awsxray", "udp": "0.0.0.0:2000"}
[otel] 2021-11-03T14:52:27.896Z info    udppoller/poller.go:109 Listening on endpoint for X-Ray segments        {"kind": "receiver", "name": "awsxray", "udp": "0.0.0.0:2000"}
[otel] 2021-11-03T14:52:27.896Z info    awsxrayreceiver@v0.38.0/receiver.go:71  Listening on endpoint for X-Ray segments        {"kind": "receiver", "name": "awsxray", "udp": "0.0.0.0:2000"}
[otel] 2021-11-03T14:52:27.896Z info    builder/receivers_builder.go:228        Receiver was built.     {"kind": "receiver", "name": "awsxray", "datatype": "traces"}
[otel] 2021-11-03T14:52:27.896Z info    service/service.go:86   Starting extensions...
[otel] 2021-11-03T14:52:27.897Z info    extensions/extensions.go:38     Extension is starting...        {"kind": "extension", "name": "health_check"}
[otel] 2021-11-03T14:52:27.897Z info    healthcheckextension@v0.38.0/healthcheckextension.go:40 Starting health_check extension {"kind": "extension", "name": "health_check", "config": {"Port":0,"TCPAddr":{"Endpoint":"0.0.0.0:13133"}}}
[otel] 2021-11-03T14:52:27.897Z info    extensions/extensions.go:42     Extension started.      {"kind": "extension", "name": "health_check"}
[otel] 2021-11-03T14:52:27.897Z info    service/service.go:91   Starting exporters...
[otel] 2021-11-03T14:52:27.897Z info    builder/exporters_builder.go:40 Exporter is starting... {"kind": "exporter", "name": "awsemf"}
[otel] 2021-11-03T14:52:27.897Z info    builder/exporters_builder.go:48 Exporter started.       {"kind": "exporter", "name": "awsemf"}
[otel] 2021-11-03T14:52:27.897Z info    builder/exporters_builder.go:40 Exporter is starting... {"kind": "exporter", "name": "awsxray"}
[otel] 2021-11-03T14:52:27.897Z info    builder/exporters_builder.go:48 Exporter started.       {"kind": "exporter", "name": "awsxray"}
[otel] 2021-11-03T14:52:27.897Z info    service/service.go:96   Starting processors...
[otel] 2021-11-03T14:52:27.897Z info    builder/pipelines_builder.go:52 Pipeline is starting... {"pipeline_name": "traces", "pipeline_datatype": "traces"}
[otel] 2021-11-03T14:52:27.897Z info    builder/pipelines_builder.go:63 Pipeline is started.    {"pipeline_name": "traces", "pipeline_datatype": "traces"}
[otel] 2021-11-03T14:52:27.897Z info    builder/pipelines_builder.go:52 Pipeline is starting... {"pipeline_name": "metrics", "pipeline_datatype": "metrics"}
[otel] 2021-11-03T14:52:27.897Z info    builder/pipelines_builder.go:63 Pipeline is started.    {"pipeline_name": "metrics", "pipeline_datatype": "metrics"}
[otel] 2021-11-03T14:52:27.897Z info    service/service.go:101  Starting receivers...
[otel] 2021-11-03T14:52:27.897Z info    builder/receivers_builder.go:68 Receiver is starting... {"kind": "receiver", "name": "otlp"}
[otel] 2021-11-03T14:52:27.897Z info    otlpreceiver/otlp.go:68 Starting GRPC server on endpoint 0.0.0.0:4317   {"kind": "receiver", "name": "otlp"}
[otel] 2021-11-03T14:52:27.897Z info    otlpreceiver/otlp.go:86 Starting HTTP server on endpoint 0.0.0.0:55681  {"kind": "receiver", "name": "otlp"}
[otel] 2021-11-03T14:52:27.897Z info    builder/receivers_builder.go:73 Receiver started.       {"kind": "receiver", "name": "otlp"}
[otel] 2021-11-03T14:52:27.897Z info    builder/receivers_builder.go:68 Receiver is starting... {"kind": "receiver", "name": "awsxray"}
[otel] 2021-11-03T14:52:27.897Z info    awsxrayreceiver@v0.38.0/receiver.go:98  X-Ray TCP proxy server started  {"kind": "receiver", "name": "awsxray"}
[otel] 2021-11-03T14:52:27.898Z info    builder/receivers_builder.go:73 Receiver started.       {"kind": "receiver", "name": "awsxray"}
[otel] 2021-11-03T14:52:27.898Z info    healthcheck/handler.go:129      Health Check state change       {"kind": "extension", "name": "health_check", "status": "ready"}
[otel] 2021-11-03T14:52:27.898Z info    service/telemetry.go:92 Setting up own telemetry...
[otel] 2021-11-03T14:52:27.900Z info    service/telemetry.go:116        Serving Prometheus metrics      {"address": ":8888", "level": "basic", "service.instance.id": "6f6f17e5-cf7b-460b-aa80-8efe6d04775d", "service.version": "latest"}
[otel] 2021-11-03T14:52:27.900Z info    service/collector.go:230        Starting aws-otel-collector...  {"Version": "v0.14.0", "NumCPU": 2}
[otel] 2021-11-03T14:52:27.900Z info    service/collector.go:132        Everything is ready. Begin running and processing data.

And this is its sidecar configuration:

        - name: otel
          image: public.ecr.aws/aws-observability/aws-otel-collector:v0.14.0
          env:
            - name: AWS_REGION
              value: 'us-east-1'
          imagePullPolicy: Always
          resources:
            limits:
              cpu: 256m
              memory: 512Mi
            requests:
              cpu: 32m
              memory: 24Mi

@anuraaga (Contributor)

anuraaga commented Nov 4, 2021

Hi @mentos1386 - it looks like OpenTelemetry JS is still using a version of the protocol from more than a year ago:

https://github.com/open-telemetry/opentelemetry-js/tree/main/experimental/packages/opentelemetry-exporter-trace-otlp-proto

Recent versions of the collector do not support this protocol version for metrics. This looks like the issue to track for support of the new proto and the updated metrics code:

open-telemetry/opentelemetry-js#2586

OpenTelemetry metrics are currently alpha and different languages have varying support, but all are fairly unstable right now. I would recommend holding off on using OpenTelemetry for metrics until a stable release if possible.

@mentos1386 (Author)

mentos1386 commented Nov 4, 2021

@anuraaga thanks for such a fast reply.

A couple of issues for anyone wanting to track the progress of opentelemetry-js:

open-telemetry/opentelemetry-js#2480
open-telemetry/opentelemetry-js#2574
https://github.com/open-telemetry/opentelemetry-specification/milestone/15

@mentos1386 (Author)

mentos1386 commented Nov 4, 2021

For anyone interested: a viable workaround is to use the Prometheus exporter and receiver instead.
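As a rough sketch of what that could look like on the collector side (the scrape target, port, and job name below are placeholders, not taken from this thread; the app would expose a Prometheus endpoint, e.g. via @opentelemetry/exporter-prometheus):

    receivers:
      prometheus:
        config:
          scrape_configs:
            - job_name: 'app'                    # placeholder job name
              scrape_interval: 15s
              static_configs:
                - targets: ['localhost:9464']    # assumed app metrics port
    exporters:
      awsemf:
        region: us-east-1
    service:
      pipelines:
        metrics:
          receivers: [prometheus]
          exporters: [awsemf]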

@loganlinn

loganlinn commented Dec 10, 2021

Another workaround is to use a collector version <= v0.29.0.
The breaking changes around doubleSum were made in v0.30.0.
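For instance, pinning the sidecar to an upstream contrib image from before that change might look roughly like the following (the image name and tag are assumptions on my part; the aws-otel-collector distro seen earlier in this thread is versioned separately, so its matching tag is not given here):

        - name: otel
          image: otel/opentelemetry-collector-contrib:0.29.0   # last release before the doubleSum change (assumed tag)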

@alolita added the question, metrics, and otel-bug labels on Jan 7, 2022
@alolita closed this as completed on Jan 7, 2022