During a migration from regular ES indexes to data streams, I discovered I was unable to generate a data stream name with placeholders.
We are in a situation where dev teams can put messages onto an exchange with properties that describe their product and format. These were being routed to distinct logstash-style indexes (containing data with different schemas) with a config along the lines of
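The pre-migration config itself is not included in this excerpt; a minimal sketch of that kind of per-product routing, using placeholders backed by buffer chunk keys (host, prefix, and field names are assumptions carried over from the data-stream configs below), would look something like:

```
# Sketch only: route records to per-product/per-format indexes
# by extracting ${product} and ${logstream} from the chunk keys.
<match to.es>
  @type elasticsearch
  hosts host.docker.internal:9200
  index_name ceres-${product}-${logstream}
  <buffer product,logstream>
    @type memory
  </buffer>
</match>
```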
When attempting to switch to data streams, I found that the plugin does not seem to have any equivalent placeholder support.
What I have tried:

- Replacements in `data_stream_name` are ignored (this path also requires `_index` in the buffer config).
- Setting `_index` as a field in the record might have been doing the right thing, but the commits fail because `_index` is not a permitted field in an ES document. This happens even if we ask Fluentd not to send the field with the record.
- Using a different field name via `target_index_key` is seemingly ignored (everything is written to `ceres-`).
- I did find a config (probably the middle one) that created all the expected data streams but did not populate them with any documents; I have been unable to fully replicate that today.
Can you point out where those are missing in the attached gist, which shows the variations of the config that were attempted?
To quote the gist:

```
# This was the first obvious thing, and the thing I hoped would work:
# just specify the same replacements in the data stream name.
<match to.es>
  @id es_with_name_placeholder
  @type elasticsearch_data_stream
  hosts host.docker.internal:9200
  http_backend typhoeus
  reload_connections false
  data_stream_name ceres-${product}-${logstream}
  data_stream_template_name ceres-datastream
  id_key _hash
  <buffer product,logstream>
    @type memory
    chunk_limit_records 2
    queued_chunks_limit_size 1
    retry_max_times 0
  </buffer>
</match>
# The above errors on the buffer config not containing _index.
# I did a little poking around and determined that that field was
# likely the index/data-stream name in use.
```
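A hedged guess at the follow-up variant the gist comments point towards (not verified against the gist; using `_index` as the chunk key, the `target_index_key` value, and the `ceres-fallback` default name are all assumptions):

```
# Second attempt (sketch): satisfy the "_index must be a chunk key"
# error by buffering on _index and naming the target field explicitly.
<match to.es>
  @id es_with_index_field
  @type elasticsearch_data_stream
  hosts host.docker.internal:9200
  http_backend typhoeus
  reload_connections false
  data_stream_name ceres-fallback
  data_stream_template_name ceres-datastream
  target_index_key _index
  id_key _hash
  <buffer _index>
    @type memory
    chunk_limit_records 2
    queued_chunks_limit_size 1
    retry_max_times 0
  </buffer>
</match>
```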
Steps to replicate
The complete minimal test cases and logs for the situations described above can be found at https://gist.github.com/javajawa/4283666d2489b0d9d9abb8909d077476
Expected Behavior or What you need to ask
- Are dynamically named data streams supported?
- If so, how?
- If not, can they be?
Using Fluentd and ES plugin versions
This situation has been tested using a docker image running with:
The cluster is ES 8.3.3. The templates in question are all very trivial (for now).
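The "very trivial" templates themselves are not shown; a minimal data-stream index template of the kind implied (the index pattern and priority here are assumptions) can be created on ES 8.x with:

```
PUT _index_template/ceres-datastream
{
  "index_patterns": ["ceres-*"],
  "data_stream": {},
  "priority": 200
}
```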