
otelcol.processor.batch: new component #2333

Merged — 4 commits merged into grafana:main from the otelcol.processor.batch branch on Oct 11, 2022

Conversation

@rfratto (Member) commented on Oct 7, 2022

PR Description

Introduce a new otelcol.processor.batch component.

Which issue(s) this PR fixes

Closes #2285.

Notes to the Reviewer

PR Checklist

  • CHANGELOG updated
  • Documentation added
  • Tests updated

Introduce an `otelcol.processor.batch` component which wraps around the
upstream batch processor.

Closes grafana#2285.
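For illustration, a rough River configuration sketch of what using this component might look like. Only `send_batch_max_size` and the `output` block appear in this PR's code; the other attribute names and the `otelcol.exporter.otlp` component are assumptions based on the upstream batch processor:

```river
otelcol.processor.batch "default" {
  // Assumed batching attributes mirroring the upstream batch processor.
  send_batch_size     = 8192
  send_batch_max_size = 0
  timeout             = "2s"

  output {
    // Forward batched traces to a hypothetical downstream exporter.
    traces = [otelcol.exporter.otlp.default.input]
  }
}
```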
@rfratto force-pushed the otelcol.processor.batch branch from d60c1ac to dfde58e on October 7, 2022 15:47
@karengermond (Contributor) left a comment

LGTM!

By default, telemetry data will be dropped. To send telemetry data to other
components, configure the `metrics`, `logs`, and `traces` arguments
accordingly.
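A hedged sketch of that wiring, assuming a downstream `otelcol.exporter.otlp` component labeled "default" exists elsewhere in the configuration:

```river
otelcol.processor.batch "default" {
  output {
    // Each signal type is listed explicitly; unset signals are dropped.
    metrics = [otelcol.exporter.otlp.default.input]
    logs    = [otelcol.exporter.otlp.default.input]
    traces  = [otelcol.exporter.otlp.default.input]
  }
}
```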

A Contributor commented:

Is it OK that we don't have examples for `otelcol.processor.batch`?

@rfratto (Member, Author) replied:

Yeah, I'm planning on adding examples after #2288 is implemented, since that would allow examples for an entire pipeline of OpenTelemetry Collector components.

// SendBatchMaxSize is the upper limit of a batch size. 0 means no upper limit.
SendBatchMaxSize uint32 `river:"send_batch_max_size,attr,optional"`

// Output configures where to send processed data. Required.
Output *otelcol.ConsumerArguments `river:"output,block"`
A Member asked:

Do we want to reuse the `forward_to` terminology we used for Prometheus, or is this more in line with what OTel does?

@rfratto (Member, Author) replied:

Maybe. I think we need to think about it after we have a good set of otelcol components in. It might make sense for us to use different terminology just to help prevent confusion (for example, someone assuming that `forward_to` can be used to send data to a `prometheus.*` receiver).
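To illustrate the distinction being discussed, a rough sketch of the two styles; the component labels, scrape target, and downstream components are hypothetical:

```river
// Prometheus-style components pass data onward through a forward_to argument.
prometheus.scrape "default" {
  targets    = [{"__address__" = "localhost:9090"}]
  forward_to = [prometheus.remote_write.default.receiver]
}

// otelcol components instead take per-signal lists inside an output block.
otelcol.processor.batch "default" {
  output {
    metrics = [otelcol.exporter.otlp.default.input]
  }
}
```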

@tpaschalis (Member) left a comment

LGTM!

@rfratto merged commit 21bda68 into grafana:main on Oct 11, 2022
@rfratto deleted the otelcol.processor.batch branch on October 11, 2022 12:47
The github-actions bot added the frozen-due-to-age label (Locked due to a period of inactivity. Please open new issues or PRs if more discussion is needed.) on Mar 17, 2024
The github-actions bot locked the conversation as resolved and limited it to collaborators on Mar 17, 2024
Labels: frozen-due-to-age
Projects: None yet
Development: Successfully merging this pull request may close this issue: Flow: otelcol.processor.batch component (#2285)
3 participants