Some info logs missing in stdout if telemetry logs level info is used #5214

Closed
Mario-Hofstaetter opened this issue Apr 14, 2022 · 3 comments
Labels: bug (Something isn't working), Stale

Mario-Hofstaetter commented Apr 14, 2022

Describe the bug

TL;DR: Some info log messages from otelcol telemetry are missing in stdout when using log level info.

I started otelcol with the telemetry log level set to debug and got these logs in stdout.

Notice the log level is info for all of them:

```
❯ .\otelcol.exe --config .\otel-collector-config.yaml | grep Endpoint

{"level":"info","ts":1649944125.978369,"caller":"healthcheckextension@v0.48.0/healthcheckextension.go:44","msg":"Starting health_check extension","kind":"extension","name":"health_check","config":{"Port":0,"TCPAddr":{"Endpoint":"localhost:13133"},"Path":"/","CheckCollectorPipeline":{"Enabled":false,"Interval":"5m","ExporterFailureThreshold":5}}}
{"level":"info","ts":1649944125.9843705,"caller":"zpagesextension/zpagesextension.go:53","msg":"Starting zPages extension","kind":"extension","name":"zpages","config":{"TCPAddr":{"Endpoint":"localhost:42424"}}}
{"level":"info","ts":1649944125.9853702,"caller":"zapgrpc/zapgrpc.go:174","msg":"[core] parsed dial target is: {Scheme:ax-monitor-01.ax.tech Authority: Endpoint:14250 URL:{Scheme:ax-monitor-01.ax.tech Opaque:14250 User: Host: Path: RawPath: ForceQuery:false RawQuery: Fragment: RawFragment:}}","grpc_log":true}
{"level":"info","ts":1649944125.9853702,"caller":"zapgrpc/zapgrpc.go:174","msg":"[core] parsed dial target is: {Scheme:passthrough Authority: Endpoint:ax-monitor-01.ax.tech:14250 URL:{Scheme:passthrough Opaque: User: Host: Path:/ax-monitor-01.ax.tech:14250 RawPath: ForceQuery:false RawQuery: Fragment: RawFragment:}}","grpc_log":true}
{"level":"info","ts":1649944125.9989312,"caller":"otlpreceiver/otlp.go:69","msg":"Starting GRPC server on endpoint localhost:4317","kind":"receiver","name":"otlp"}
{"level":"info","ts":1649944125.9998899,"caller":"otlpreceiver/otlp.go:87","msg":"Starting HTTP server on endpoint localhost:4318","kind":"receiver","name":"otlp"}
```

When the log level is set to info in the telemetry section, some messages are gone:

```
❯ .\otelcol.exe --config .\otel-collector-config.yaml | grep Endpoint

{"level":"info","ts":1649944166.9116242,"caller":"zpagesextension/zpagesextension.go:53","msg":"Starting zPages extension","kind":"extension","name":"zpages","config":{"TCPAddr":{"Endpoint":"localhost:42424"}}}
{"level":"info","ts":1649944166.9116242,"caller":"healthcheckextension@v0.48.0/healthcheckextension.go:44","msg":"Starting health_check extension","kind":"extension","name":"health_check","config":{"Port":0,"TCPAddr":{"Endpoint":"localhost:13133"},"Path":"/","CheckCollectorPipeline":{"Enabled":false,"Interval":"5m","ExporterFailureThreshold":5}}}
{"level":"info","ts":1649944166.9131765,"caller":"otlpreceiver/otlp.go:69","msg":"Starting GRPC server on endpoint localhost:4317","kind":"receiver","name":"otlp"}
{"level":"info","ts":1649944166.9136999,"caller":"otlpreceiver/otlp.go:87","msg":"Starting HTTP server on endpoint localhost:4318","kind":"receiver","name":"otlp"}
```

To be precise, the messages mentioning the Jaeger exporter endpoint (port 14250) that I was debugging have disappeared.

Steps to reproduce

The full config is at the end. I use this telemetry block under `service:`; setting level to debug prints the messages mentioned above.

```yaml
  telemetry:
    logs:
      level: info
      encoding: json
      output_paths: ["stdout"]
      error_output_paths: ["stdout"]
    metrics:
      address: localhost:8888
```

Using INFO in CAPS does not change the behavior.
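
For reference, the telemetry settings above map roughly onto a plain zap production config like the sketch below (an illustrative mapping, not the collector's actual wiring). A zap logger gated at info still prints every info entry, which is what makes the missing messages surprising:

```go
package main

import (
	"go.uber.org/zap"
	"go.uber.org/zap/zapcore"
)

func main() {
	// Roughly what `telemetry.logs: {level: info, encoding: json,
	// output_paths: [stdout]}` asks for: a JSON-encoded zap logger
	// gated at the info level.
	cfg := zap.NewProductionConfig()
	cfg.Level = zap.NewAtomicLevelAt(zapcore.InfoLevel)
	cfg.Encoding = "json"
	cfg.OutputPaths = []string{"stdout"}
	cfg.ErrorOutputPaths = []string{"stdout"}

	logger, err := cfg.Build()
	if err != nil {
		panic(err)
	}
	defer logger.Sync()

	logger.Info("printed: info passes an info-level gate")
	logger.Debug("dropped: debug sits below the gate")
}
```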

What did you expect to see?

Setting the log level to info should not hide any info-level messages.

What did you see instead?

Some info messages are no longer printed to stdout.

What version did you use?

otelcol version 0.48.0

What config did you use?

Full config file, if relevant to the issue:

```yaml
exporters:
  jaeger:
    endpoint: ax-monitor-01.ax.tech:14250
    tls:
      insecure: true
  prometheus:
    endpoint: 0.0.0.0:7299
    send_timestamps: true

receivers:
  jaeger:
    protocols:
      thrift_compact:
        endpoint: localhost:6831
  otlp:
    protocols:
      grpc:
        endpoint: localhost:4317
      http:
        endpoint: localhost:4318
  prometheus:
    config:
      scrape_configs:
        - job_name: localmetrics
          scrape_interval: 17s

          tls_config:
            insecure_skip_verify: true

          static_configs:
            - targets: [localhost:8888] # Self diagnostic metrics of otelcol
              labels:
                from_app: otelcol
          file_sd_configs:
            - files:
              - "C:/Program Files/OpenTelemetry/OTEL Collector/metric-targets/*.yaml"

processors:
  memory_limiter:
    check_interval: 1s
    limit_mib: 2000
  batch:

extensions:
  health_check:
    endpoint: localhost:13133
  zpages:
    endpoint: localhost:42424
  memory_ballast:
    size_mib: 256

service:
  extensions: [health_check, zpages]
  pipelines:

    traces:
      receivers: [otlp, jaeger]
      processors: [memory_limiter, batch]
      exporters: [jaeger]

    metrics:
      receivers: [prometheus]
      processors: [memory_limiter, batch]
      exporters: [prometheus]

  # Otel-Collector Self-Diagnostics
  telemetry:
    logs:
      level: info
      encoding: json
      output_paths: ["stdout"]
      error_output_paths: ["stdout"]
    metrics:
      address: localhost:8888
```

Environment
Windows 10 21H2


Mario-Hofstaetter added the bug label on Apr 14, 2022
jpkrohling (Member) commented Apr 14, 2022

Interesting. The Jaeger ones are actually logged by the gRPC layer, and I remember we had to hook the gRPC logger with our main logger. Perhaps there's something wrong with this hook? For the record, I would expect such log entries to be debug, not info.
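
For context, wiring gRPC's internal logs into a zap logger is typically done with `grpclog.SetLoggerV2` and the `zapgrpc` adapter, roughly as in the sketch below (an illustrative pattern, not necessarily the collector's exact code). If such a hook were only installed when the collector runs at debug level, the gRPC-emitted info entries would vanish at info level even though gRPC logs them as info:

```go
package main

import (
	"go.uber.org/zap"
	"go.uber.org/zap/zapgrpc"
	"google.golang.org/grpc/grpclog"
)

func main() {
	logger, err := zap.NewProduction() // JSON encoder, info level
	if err != nil {
		panic(err)
	}
	defer logger.Sync()

	// Route gRPC-internal output (e.g. the "[core] parsed dial target"
	// entries above) through the main zap logger. Whether and when this
	// hook is installed decides whether those entries reach stdout at all.
	grpclog.SetLoggerV2(zapgrpc.NewLogger(logger))
}
```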

Mario-Hofstaetter (Author) commented:

> The Jaeger ones are actually logged by the gRPC layer

That explains why I could not find that log message in this git repo or in opentelemetry-collector-contrib 🙈

> I would expect such log entries to be debug, not info.

I'd be glad if, during startup, essential configuration, such as the endpoints otelcol pushes to, were logged once at info level. I struggled with environment variables (see also #5223), which is why I hunted down those logs.

jpkrohling (Member) commented:

The connection state changes are indeed logged on our side, and at INFO:

https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/c7fe9fd455107165cad3b9231a9a48781c237960/exporter/jaegerexporter/exporter.go#L186

But finer-grained events are logged at the gRPC level at debug.
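
For illustration, watching a client connection and logging its state transitions at info might look like the sketch below (a simplified, hypothetical version of the pattern in the linked exporter code, not the code itself):

```go
package connstate

import (
	"context"

	"go.uber.org/zap"
	"google.golang.org/grpc"
)

// watchConnState logs every gRPC connection state change at info level,
// in the spirit of the linked jaegerexporter code (names are hypothetical).
func watchConnState(ctx context.Context, conn *grpc.ClientConn, logger *zap.Logger) {
	for {
		st := conn.GetState()
		logger.Info("connection state changed", zap.String("state", st.String()))
		// Block until the state changes again or the context is cancelled.
		if !conn.WaitForStateChange(ctx, st) {
			return // context done
		}
	}
}
```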

github-actions bot added the Stale label Apr 18, 2024
github-actions bot closed this as not planned (won't fix, can't repro, duplicate, stale) May 18, 2024