opentelemetry-cohttp: make sure to emit data before termination #41

Closed
tatchi opened this issue Aug 30, 2023 · 1 comment · Fixed by #76
Labels: bug, help wanted

Comments

@tatchi
Contributor

tatchi commented Aug 30, 2023

I was playing around with this little example that uses opentelemetry_client_cohttp_lwt:

let run () =
  let open Lwt.Syntax in
  let* () =
    Opentelemetry_lwt.Trace.with_ "my trace" @@ fun _scope ->
    print_endline "bar" |> Lwt.return
  in
  Lwt.return_unit

let () =
  Opentelemetry_lwt.Globals.service_name := "my service";
  let config =
    Opentelemetry_client_cohttp_lwt.Config.make ~debug:true
      ~url:"http://localhost:4318" ()
  in
  Opentelemetry_client_cohttp_lwt.with_setup ~config () @@ fun () ->
  Lwt_main.run (run ())

I was surprised not to see any traces in the Jaeger UI (the same example with ocurl works fine), especially since I could see a send spans line in the logs:

bar                                    
send spans { resource = ... }
opentelemetry: exiting…

Adding a little delay fixes the issue:

diff --git a/src/bin/main.ml b/src/bin/main.ml
index fc28bf1..1416c72 100644
--- a/src/bin/main.ml
+++ b/src/bin/main.ml
@@ -6,6 +6,7 @@ let run () =
     Opentelemetry_lwt.Trace.with_ "my trace" @@ fun _scope ->
     print_endline "bar" |> Lwt.return
   in
+  let* () = Lwt_unix.sleep 2.0 in
   Lwt.return_unit
 
 let () =

My understanding is that the cleanup function doesn't have time to finish before the program terminates:

let cleanup () =
  if !debug_ then Printf.eprintf "opentelemetry: exiting…\n%!";
  Lwt.async (fun () ->
      let* () = emit_all_force httpc encoder in
      Httpc.cleanup httpc;
      Lwt.return ())

I see that we use Lwt.async, which doesn't wait for the resolution of the promise. See for example this code:

let cleanup () =
  Lwt.async (fun () ->
      let open Lwt.Syntax in
      print_endline "start cleanup";
      (* simulate delay *)
      let* () = Lwt_unix.sleep 2.0 in
      print_endline "end cleanup";
      Lwt.return_unit)

let run () =
  print_endline "hello";
  cleanup () |> Lwt.return

let () = Lwt_main.run (run ())

And the output:

hello                                  
start cleanup
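
For comparison, a variant of the same toy program where cleanup returns its promise instead of firing it with Lwt.async, and run awaits it, does print "end cleanup" before exiting (just a sketch, not the library code):

let cleanup () =
  let open Lwt.Syntax in
  print_endline "start cleanup";
  (* simulate delay *)
  let* () = Lwt_unix.sleep 2.0 in
  print_endline "end cleanup";
  Lwt.return_unit

let run () =
  let open Lwt.Syntax in
  print_endline "hello";
  (* await the cleanup promise before finishing *)
  let* () = cleanup () in
  Lwt.return_unit

let () = Lwt_main.run (run ())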

So in the case of opentelemetry_client_cohttp_lwt, the emit_all_force httpc encoder call doesn't have time to complete before the program terminates.

Should we make sure that all data is emitted before the program terminates (both after "normal" completion and after explicit interruption)?

I guess one way to fix this would be to make the cleanup function return a unit Lwt.t so that we can properly wait for its resolution. But the problem is that this is part of a common interface that different backends implement, so I don't think it's possible to force an Lwt value on it. Maybe we can make it parametric or something?
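
For instance (a purely hypothetical sketch, not the actual backend interface in this repo), the signature could abstract the return type of cleanup, so that a blocking backend keeps plain unit while the cohttp-lwt backend exposes unit Lwt.t and with_setup can await it:

(* Hypothetical sketch of the "parametric" idea; names are illustrative. *)
module type BACKEND = sig
  type 'a io
  (* ... the other backend callbacks are elided ... *)

  (* a synchronous backend would use [type 'a io = 'a];
     the cohttp-lwt backend would use [type 'a io = 'a Lwt.t] *)
  val cleanup : unit -> unit io
end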

@tatchi changed the title from "opentelemetry-cohttp: make sure to send traces before termination" to "opentelemetry-cohttp: make sure to emit data before termination" on Aug 30, 2023
@c-cube added the bug label on Dec 5, 2023
@c-cube added the help wanted label on Jan 8, 2024
c-cube added a commit that referenced this issue Oct 17, 2024
in `Opentelemetry_client_cohttp_lwt.with_setup` we should now wait for
the cleanup to be done, by sneaking in a `unit Lwt.u` that is only
resolved after the cleanup is done.

close #41
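
A minimal, self-contained sketch of that waiter/resolver trick (illustrative names, not the actual code from #76): cleanup still runs under Lwt.async, but it resolves a unit Lwt.u once its work is done, and the caller blocks on the corresponding promise before exiting.

let cleaned, signal_cleaned = Lwt.wait ()

let cleanup () =
  Lwt.async (fun () ->
      let open Lwt.Syntax in
      print_endline "start cleanup";
      (* stands in for the real [emit_all_force httpc encoder] *)
      let* () = Lwt_unix.sleep 2.0 in
      print_endline "end cleanup";
      (* signal that the cleanup actually finished *)
      Lwt.wakeup_later signal_cleaned ();
      Lwt.return_unit)

let () =
  print_endline "hello";
  cleanup ();
  (* wait for the async cleanup to complete before the program exits *)
  Lwt_main.run cleaned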
@c-cube
Member

c-cube commented Oct 17, 2024

I think I found a sneaky solution to that problem, see #76 :)
