fix: BatchExporter should export continuously when batch size is reached #3958
Conversation
Codecov Report
@@            Coverage Diff             @@
##             main    #3958      +/-   ##
==========================================
- Coverage   92.25%   92.25%   -0.01%
==========================================
  Files         331      331
  Lines        9465     9473       +8
  Branches     1997     1999       +2
==========================================
+ Hits         8732     8739       +7
- Misses        733      734       +1
While this works, it has a major issue which I also ran into while trying to solve this problem. If many spans are created quickly, the exports are started without any bound on concurrency, which can result in an unbounded number of concurrent exports. Please take a look at #3828 for a solution which takes this into account and limits concurrency. Right now the biggest TODOs are related to testing, but I would appreciate comments on the main body of the code if you have any.
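To make the concern concrete, here is a minimal sketch of one way to bound export concurrency; the class and member names (BoundedBatchScheduler, _pendingExports, _maxConcurrentExports) are made up for illustration and are not taken from this PR or from #3828:

```ts
// Illustrative only: cap how many exports can be in flight at once when the
// buffer fills faster than the exporter drains it. All names are hypothetical.
class BoundedBatchScheduler<T> {
  private _buffer: T[] = [];
  private _pendingExports = 0;

  constructor(
    private readonly _maxExportBatchSize: number,
    private readonly _maxConcurrentExports: number,
    private readonly _exportBatch: (batch: T[]) => Promise<void>
  ) {}

  add(item: T): void {
    this._buffer.push(item);
    this._maybeFlush();
  }

  private _maybeFlush(): void {
    // Start another export only while a full batch is ready AND we are below
    // the concurrency limit; otherwise items simply wait in the buffer.
    while (
      this._buffer.length >= this._maxExportBatchSize &&
      this._pendingExports < this._maxConcurrentExports
    ) {
      const batch = this._buffer.splice(0, this._maxExportBatchSize);
      this._pendingExports++;
      this._exportBatch(batch).finally(() => {
        this._pendingExports--;
        // A slot freed up; check whether another full batch is waiting.
        this._maybeFlush();
      });
    }
  }
}
```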
Dismissed my previous rejection. Can we add a test to ensure the concurrency is actually limited?
It looks like you've modified some existing tests and haven't added any new ones, though one of the tests you modified does appear to exercise the eager behavior. There are also some changes to the tests whose purpose isn't immediately obvious; I've left comments on those specific tests.
/cc @martinkuba looks like this actually doesn't have the same concurrency issue. He limited it in a slightly different way than I did, but it requires a lot less change, so I might actually prefer this solution.
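As a rough illustration of the kind of test being asked for, the stub below records how many exports overlap; it is a sketch only, and the exporter class, its method shape, and the helper mentioned in the trailing comment are hypothetical rather than part of the SDK's test suite:

```ts
// Hypothetical exporter stub that records the maximum number of overlapping
// export calls. Exports stay "in flight" until the test releases them.
class ConcurrencyTrackingExporter {
  active = 0;
  maxObserved = 0;
  private _pending: Array<() => void> = [];

  export(_batch: unknown[], resultCallback: () => void): void {
    this.active++;
    this.maxObserved = Math.max(this.maxObserved, this.active);
    this._pending.push(() => {
      this.active--;
      resultCallback();
    });
  }

  releaseAll(): void {
    this._pending.splice(0).forEach(release => release());
  }
}

// Sketch of the assertion a test could make after queueing several full
// batches (processor wiring omitted; addSpansToProcessor is hypothetical):
//   addSpansToProcessor(processor, maxExportBatchSize * 3);
//   assert.strictEqual(exporter.maxObserved, expectedConcurrencyLimit);
//   exporter.releaseAll();
```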
This PR is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days.
return flush();
}
if (this._timer !== undefined) return;
this._timer = setTimeout(() => flush(), this._scheduledDelayMillis);
Nice to have
It would be "nice" to clear (set to undefined) this._timer, just to avoid calling clearTimeout() with an expired timer. This would also allow (in a future PR) some slightly better performance, rather than always calling _clearTimeout().
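A minimal sketch of that suggestion follows; the class and field names are assumed for illustration and are not the SDK's actual private API:

```ts
// Sketch: reset _timer to undefined when the scheduled flush fires, so the
// cleanup path can skip clearTimeout() when no timer is actually pending.
class TimerOwner {
  private _timer?: ReturnType<typeof setTimeout>;

  constructor(private readonly _scheduledDelayMillis: number) {}

  schedule(flush: () => void): void {
    if (this._timer !== undefined) return; // a flush is already scheduled
    this._timer = setTimeout(() => {
      this._timer = undefined; // timer has fired; nothing left to clear
      flush();
    }, this._scheduledDelayMillis);
  }

  cancel(): void {
    // Because _timer is reset when the timer fires, this is a no-op when idle.
    if (this._timer === undefined) return;
    clearTimeout(this._timer);
    this._timer = undefined;
  }
}
```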
Which problem is this PR solving?
The batch export processor should trigger an export when either of these conditions is met:
- scheduledDelayMillis has elapsed since the last export
- the batch has maxExportBatchSize items in it
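As a rough sketch of this trigger logic (the class and method names here, such as EagerBatcher and _flushOneBatch, are illustrative and not the SDK's actual implementation):

```ts
// Illustrative sketch: export eagerly when the batch is full, otherwise fall
// back to the scheduled-delay timer. Names are hypothetical.
class EagerBatcher<T> {
  private _buffer: T[] = [];
  private _timer?: ReturnType<typeof setTimeout>;

  constructor(
    private readonly _maxExportBatchSize: number,
    private readonly _scheduledDelayMillis: number,
    private readonly _flushOneBatch: (batch: T[]) => void
  ) {}

  addToBuffer(item: T): void {
    this._buffer.push(item);

    // Condition 2: the batch is full, so export immediately rather than
    // waiting for the timer to fire.
    if (this._buffer.length >= this._maxExportBatchSize) {
      this._flush();
      return;
    }

    // Condition 1: otherwise make sure a flush is scheduled to run
    // scheduledDelayMillis after the first unexported item arrived.
    if (this._timer === undefined) {
      this._timer = setTimeout(() => this._flush(), this._scheduledDelayMillis);
    }
  }

  private _flush(): void {
    if (this._timer !== undefined) {
      clearTimeout(this._timer);
      this._timer = undefined;
    }
    this._flushOneBatch(this._buffer.splice(0, this._maxExportBatchSize));
  }
}
```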
items in itExample from the dotnet implementation https://github.com/open-telemetry/opentelemetry-dotnet/blob/main/src/OpenTelemetry/BatchExportProcessor.cs#L265
Relevant portion of the specification https://github.com/open-telemetry/opentelemetry-specification/blob/main/specification/trace/sdk.md?plain=1#L610
Fixes #3094