
feat: warn when hooked module is already loaded #2926

Merged
2 commits merged into open-telemetry:main on Apr 27, 2022

Conversation

nozik
Contributor

@nozik nozik commented Apr 25, 2022

Which problem is this PR solving?

A common pitfall with OpenTelemetry JS is requiring modules before the require-in-the-middle hook is set. In many cases, end users don't understand why certain instrumentations aren't working while others are.

Short description of the changes

Check the require cache when initializing each instrumentation; if the module was already loaded, log a warning.

Type of change


  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

How Has This Been Tested?

Manually

Checklist:

  • Followed the style guidelines of this project
  • Unit tests have been added
  • Documentation has been updated

@nozik nozik requested a review from a team April 25, 2022 04:45
@nozik nozik force-pushed the warn_on_late_require_hook branch from cfd43c5 to dc87a89 Compare April 25, 2022 04:47
@nozik
Contributor Author

nozik commented Apr 25, 2022

@vmarchaud As discussed, FYI

@codecov

codecov bot commented Apr 25, 2022

Codecov Report

Merging #2926 (19c3f6c) into main (ce7e98f) will decrease coverage by 0.61%.
The diff coverage is n/a.

❗ Current head 19c3f6c differs from pull request most recent head 051b5a5. Consider uploading reports for the commit 051b5a5 to get more accurate results

@@            Coverage Diff             @@
##             main    #2926      +/-   ##
==========================================
- Coverage   92.81%   92.19%   -0.62%     
==========================================
  Files         184       69     -115     
  Lines        6066     1999    -4067     
  Branches     1296      437     -859     
==========================================
- Hits         5630     1843    -3787     
+ Misses        436      156     -280     
Impacted Files Coverage Δ
...ckages/opentelemetry-sdk-trace-web/karma.worker.js 0.00% <0.00%> (-100.00%) ⬇️
...s/opentelemetry-core/src/platform/node/sdk-info.ts 0.00% <0.00%> (-100.00%) ⬇️
...opentelemetry-core/src/platform/node/globalThis.ts 0.00% <0.00%> (-100.00%) ⬇️
...pentelemetry-core/src/platform/node/performance.ts 0.00% <0.00%> (-100.00%) ⬇️
...emetry-api-metrics/src/platform/node/globalThis.ts 0.00% <0.00%> (-100.00%) ⬇️
...lemetry-resources/src/detectors/ProcessDetector.ts 31.81% <0.00%> (-68.19%) ⬇️
...ckages/opentelemetry-exporter-zipkin/src/zipkin.ts 84.48% <0.00%> (-15.52%) ⬇️
...lemetry-resources/src/detectors/BrowserDetector.ts 93.33% <0.00%> (-6.67%) ⬇️
...strumentation/src/platform/node/instrumentation.ts
...metry-instrumentation-grpc/src/grpc/serverUtils.ts
... and 113 more

Member

@blumamir blumamir left a comment


LGTM.
I remember we had this in the past and how helpful it was for debugging.
Added a few nits about the message text.

@vmarchaud
Member

I remember we had it in the past and how helpful it is for debugging.

Yeah, I actually checked because I remembered writing it; it seems it was lost when we moved from plugins to instrumentations.

@blumamir
Member

I remember we had it in the past and how helpful it is for debugging.

Yeah, I actually checked because I remembered writing it; it seems it was lost when we moved from plugins to instrumentations.

And I remember that it also used to try to patch modules that were required before the instrumentation was enabled (from require.cache). I wonder whether that was removed intentionally.

@nozik nozik force-pushed the warn_on_late_require_hook branch from 1777fc6 to 93e9302 Compare April 25, 2022 10:12
@nozik nozik requested review from vmarchaud and blumamir April 25, 2022 10:13
@rauno56
Member

rauno56 commented Apr 25, 2022

@nozik, do you think you could add a test for it?

I think the messaging coming with a @opentelemetry/... prefix doesn't need to mention OTel in the message, but I don't think it's a deal breaker either way.

@blumamir
Member

I think the messaging coming with a @opentelemetry/... prefix doesn't need to mention OTel in the message

@rauno56 What do you mean by that?

@rauno56
Member

rauno56 commented Apr 25, 2022

I think the messaging coming with a @opentelemetry/... prefix doesn't need to mention OTel in the message
@rauno56 What do you mean by that?

diag logs are prefixed with the package name (or another ID) they're coming from, and in this case look something like this:

@opentelemetry/instrumentation-express A log from Express instrumentation

Even if one is unfamiliar with OTel, they can infer the context from the prefix.

@nozik nozik force-pushed the warn_on_late_require_hook branch from aa6dc65 to 4021e62 Compare April 26, 2022 06:48
@nozik nozik force-pushed the warn_on_late_require_hook branch from 4021e62 to 051b5a5 Compare April 26, 2022 06:49
@dyladan dyladan merged commit 60a17ee into open-telemetry:main Apr 27, 2022
5 participants