Document bulk_max_size for all outputs, not just Elasticsearch #586

Closed
dedemorton opened this issue Dec 22, 2015 · 0 comments
The doc about configuration options currently describes the bulk_max_size option for the Elasticsearch output, but does not describe this option for the other output types (Logstash, console, and file). We need to add these descriptions to the documentation.

Here are some comments by @urso, carried over from PR #568:

So we've got multiple output plugins:

- Elasticsearch
- logstash
- console
- file
The same default bulk_max_size is used by all output plugins except Elasticsearch, which overrides the default to 50.
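
For reference, a minimal sketch of how the option might appear under each output section in a Beats 1.x style config (the hosts, paths, and non-Elasticsearch values shown are placeholders, not documented defaults):

```yaml
output:
  elasticsearch:
    hosts: ["localhost:9200"]
    bulk_max_size: 50          # Elasticsearch overrides the default to 50

  logstash:
    hosts: ["localhost:5044"]
    bulk_max_size: 2048        # placeholder value, not a documented default

  console:
    bulk_max_size: 2048        # placeholder value

  file:
    path: "/tmp/beats"
    bulk_max_size: 2048        # placeholder value
```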

This option sets the maximum number of events that are combined internally into a batch and published by the output plugins:

- If a Beat publishes single events, the events are collected into batches.
- If a Beat publishes a batch of events larger than bulk_max_size, the batch is split.
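
For example (illustrative values only, assuming the splitting behavior described above):

```yaml
output:
  logstash:
    hosts: ["localhost:5044"]   # placeholder host
    # With bulk_max_size: 3, a published batch of 7 events would be split
    # into publish requests of 3, 3, and 1 events; single events would be
    # collected into batches of up to 3 events before sending.
    bulk_max_size: 3
```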

Bigger batch sizes can improve performance by amortizing the per-event sending overhead. On the other hand, batch sizes that are too big can increase processing time to the point where the Logstash/Elasticsearch queues cannot keep up: the APIs return errors, connections get killed, or publish requests time out. This increases latency and lowers throughput for indexing events.
