
Add helper function trace.Start (fixes #1090) #1674

Closed
wants to merge 1 commit

Conversation


@pci commented Mar 8, 2021

This is an implementation of #1090 with the motivation for the change in that issue. I've tried to follow the test patterns of the surrounding tests as best I could. The test case internally uses ContextWithSpan and SpanFromContext so it's not a true unit test of solely that function, but it seemed the best way to test the functionality.

@linux-foundation-easycla

linux-foundation-easycla bot commented Mar 8, 2021

CLA Signed

The committers are authorized under a signed CLA.

@codecov

codecov bot commented Mar 8, 2021

Codecov Report

Merging #1674 (61667a8) into main (875a258) will increase coverage by 0.0%.
The diff coverage is 100.0%.


@@          Coverage Diff          @@
##            main   #1674   +/-   ##
=====================================
  Coverage   77.3%   77.3%           
=====================================
  Files        128     128           
  Lines       6681    6684    +3     
=====================================
+ Hits        5166    5171    +5     
+ Misses      1268    1266    -2     
  Partials     247     247           
Impacted Files Coverage Δ
trace/trace.go 86.9% <100.0%> (+0.3%) ⬆️
exporters/otlp/otlpgrpc/connection.go 88.7% <0.0%> (+1.8%) ⬆️

@pci
Author

pci commented Mar 8, 2021

I've signed a CLA, and I'm not quite sure why it's saying it can't find my user ID on that commit.

@pci force-pushed the feat-helper-trace-start branch from 2a0f3d5 to d301f74 on March 8, 2021 09:56
@pci force-pushed the feat-helper-trace-start branch from d301f74 to 61667a8 on March 8, 2021 10:05
@pci
Author

pci commented Mar 8, 2021

There we go; it was an issue with multiple email addresses. Sorted now.

@Aneurysm9
Member

The one thing that worries me about this approach is that it takes away knowledge of the TracerProvider and Tracer used to create the new Span. I could see this potentially being a problem if middleware had established its own TracerProvider with a different set of SpanProcessors and exporting to a different location than the global TracerProvider that the user is likely to expect. In that case if the Span in the current Context was created from that middleware TracerProvider then the newly created Span may not export to the location the user expects or may be processed in unexpected ways.

Would it be preferable to use the global TracerProvider to get a Tracer rather than extracting the Tracer from the Span contained in the context?

@pci
Author

pci commented Mar 9, 2021

@Aneurysm9 Ah, I think I've misunderstood a key difference between OpenCensus and OpenTelemetry; I opened the underlying issue 7 months ago, and best practice has matured since then.

In OpenCensus each tracer had to be wired up to an exporter (in practical usage there was only ever one tracer). So, coming to OpenTelemetry, my assumption was that to get cohesive trace output in your exporter, either all packages would have to use the same tracer, or each package would have to export its tracer so that the main entry code could wire them all up to the exporter, and that if you used multiple tracers the outputs wouldn't be linked in the final output of the TracerProvider.

Whereas, reading some of the more recent changes, it now seems much clearer that I probably misunderstood: in OpenTelemetry the main program defines the tracer provider, each package is free to make its own tracer from that provider, you don't need to worry about passing things through or wiring them in, and you'll still get linked output. For example, if you had a main package and a sub-package:

foo/main.go

func main() {
    // set up, say, a Stackdriver exporter-backed provider `tp`
    otel.SetTracerProvider(tp)

    // use the global tracer provider
    tr := otel.Tracer("github.com/pci/.../foo")
    ctx, span := tr.Start(context.Background(), "b")
    defer span.End()
    b.Run(ctx)
}

and a second package foo/b/b.go

var (
  // use the global tracer provider
  tr = otel.Tracer("github.com/pci/.../foo/b")
)

func Run(ctx context.Context) {
  ctx, otherSpan := tr.Start(ctx, "other_things")
  defer otherSpan.End()
  // do other stuff
}

then in, again say, stackdriver you'll still get a cohesive waterfall?

TL;DR - I think I misunderstood OpenTelemetry best practice.

@Aneurysm9 if you wouldn't mind confirming the above, just in case I got that wrong, but then I'll close this PR and the linked issue. And thank you for your time, much appreciated!

@Aneurysm9
Member

in OpenTelemetry that the main program defines the trace provider and that each package is free to make its own tracer from that provider and you don't need to worry about passing through or wiring in, and you'll still get linked outputs.

This is correct. The TracerProvider in OTel holds the exporter and other SpanProcessors and all Tracers created from it will go through the same processing pipeline. I think in practice it will likely remain the case that most systems only ever have a single TracerProvider, though even in that case each Tracer is intended to be named after the instrumenting package. Re-using Tracer instances pulled from the active span may result in using the right export pipeline but wrong instrumentation name (not the end of the world, but potentially confusing) or possibly using the wrong export pipeline if the span in the context you received was created by an instrumentation using a distinct TracerProvider.

That last scenario is still potentially problematic and can result in broken trace graphs if the spans are not exported to the same destination. I'm not sure we have a good way of handling this other than to recommend that instrumentations that wish to have a separate trace graph be very careful about providing context.Context that contain an active Span to code outside their control.

@MrAlias
Contributor

MrAlias commented Mar 25, 2021

@pci your comment about best practices looks accurate to me. Does that mean this PR and related Issue can be closed?

@pci
Author

pci commented Mar 26, 2021

@MrAlias apologies for the late reply; a family emergency came up. Yes, now that I understand best practice better, I think this PR would be at best not needed, at worst an anti-pattern! I'll close the PR and issue and leave a comment with an explanation. Thanks again.

@pci closed this Mar 26, 2021
@MrAlias
Contributor

MrAlias commented Mar 26, 2021

@MrAlias apologies for the late reply; a family emergency came up. Yes, now that I understand best practice better, I think this PR would be at best not needed, at worst an anti-pattern! I'll close the PR and issue and leave a comment with an explanation. Thanks again.

No worries! Thanks for the contribution and feedback 😄

@rizalgowandy

I'm so used to OpenTracing that I also thought child span creation could be automated using just the context, without a predefined package-level global tracer variable. Oh well, I guess this approach will do.
