
[backport -> release/3.4.x] perf(opentelemetry): increase queue max batch size #12489

Merged: 1 commit merged into release/3.4.x on Feb 1, 2024

Conversation

team-gateway-bot (Collaborator)

Automated backport to release/3.4.x, triggered by a label in #12488.

Original description

Summary

The max batch size for OpenTelemetry was set to the default value of 1. This value actually refers to the number of spans in a batch, so we are increasing the default to 200, which corresponds to the default used by the "old" queue implementation.
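A minimal sketch of how the larger batch size can be set explicitly in the plugin's queue parameters, using Kong's declarative configuration format. The collector endpoint URL here is a placeholder, not from this PR:

```yaml
_format_version: "3.0"
plugins:
  - name: opentelemetry
    config:
      # Hypothetical OTLP/HTTP collector endpoint; replace with your own.
      endpoint: http://otel-collector:4318/v1/traces
      queue:
        # Number of spans sent per batch; this PR raises the
        # schema default from 1 to 200.
        max_batch_size: 200
```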


Issue reference

https://konghq.atlassian.net/browse/KAG-3173


(cherry picked from commit c7cb900)
@bungle bungle merged commit 0d414b4 into release/3.4.x Feb 1, 2024
37 checks passed
@bungle bungle deleted the backport-12488-to-release/3.4.x branch February 1, 2024 20:46
Labels
cherry-pick kong-ee (schedule this PR for cherry-picking to kong/kong-ee), core/wasm (everything relevant to [proxy-]wasm), plugins/opentelemetry, schema-change-noteworthy, size/XS
3 participants