[Ingest Manager] Do not index every saved object field #70162
Conversation
Pinging @elastic/ingest-management (Team:Ingest Management)
```diff
@@ -65,9 +65,9 @@ const savedObjectTypes: { [key: string]: SavedObjectsType } = {
       config_revision: { type: 'integer' },
       config_newest_revision: { type: 'integer' },
       default_api_key_id: { type: 'keyword' },
-      default_api_key: { type: 'keyword' },
+      default_api_key: { type: 'binary', index: false },
```
binary? I assume this is something SO specific.
Yes, the docs for encrypted saved objects recommend mapping encrypted fields as `binary`.
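As a minimal sketch of what that mapping fragment looks like (only `default_api_key` comes from the diff above; the surrounding registration code is abbreviated):

```typescript
// Sketch of a saved object mapping fragment. An encrypted saved object
// field holds an opaque encrypted blob, so it is mapped as `binary`
// and excluded from the index entirely.
const agentMappings = {
  properties: {
    default_api_key_id: { type: 'keyword' },
    // encrypted field: store as binary, never index or search it
    default_api_key: { type: 'binary', index: false },
  },
};
```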
```diff
       enabled: { type: 'boolean', index: false },
       processors: { type: 'keyword', index: false },
       config: { type: 'flattened', index: false },
       vars: { type: 'flattened', index: false },
       streams: {
         type: 'nested',
```
`nested` rings some alarm bells on my end, especially as we have it twice. Is this nested docs in ES? If we don't need to query any parts inside `inputs`, let's make sure all of it is just an object with `enabled: false`. Otherwise ES will create lots of documents and slow things down.
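A sketch of that suggestion (field names illustrative): with `enabled: false` on the object, Elasticsearch keeps the value in `_source` but builds no sub-field mappings and no extra nested documents:

```typescript
// Sketch: an object field with mapping disabled. The contents of
// `inputs` are stored and returned with the document, but nothing
// inside it is parsed, indexed, or searchable.
const configMappings = {
  properties: {
    name: { type: 'keyword' },
    inputs: {
      type: 'object',
      enabled: false, // no per-field mappings, no nested docs
    },
  },
};
```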
Just updated the inputs to be `enabled: false`.
A more generic question: how does Kibana handle mapping updates between versions? What if the mapping from 7.9 is not compatible with the one from 7.10?
```diff
       dataset: { type: 'keyword', index: false },
       processors: { type: 'keyword', index: false },
       config: { type: 'flattened', index: false },
       agent_stream: { type: 'flattened', index: false },
```
Note that this change by itself won't resolve #69682.
See this comment and the one below.
I got it working by also adding `doc_values: false`. But maybe you'll find a way you like better, like changing the type.
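For reference, the `doc_values: false` variant mentioned above would look roughly like this (an illustrative sketch, not the final change in this PR):

```typescript
// Sketch: a keyword field with both the inverted index and doc_values
// disabled, so large values are only kept in _source and never hit
// the per-value size limits of either structure.
const streamMappings = {
  properties: {
    agent_stream: { type: 'keyword', index: false, doc_values: false },
  },
};
```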
I changed it to have the whole field as `enabled: false` and just tested it; it seems to work.
If Kibana sees a schema change, it will create a new index and copy the documents over (and if there is a migration function, use it while copying the documents).
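As a hypothetical sketch of that mechanism (the version key and transform below are illustrative, not from this PR): migration functions are registered per version and applied to each document while Kibana copies it into the new index:

```typescript
// Hypothetical sketch of a Kibana-style saved object migration map.
type Doc = { id: string; attributes: { [key: string]: unknown } };

const migrations: { [version: string]: (doc: Doc) => Doc } = {
  // illustrative: transform each document as it is copied to the new index
  '7.10.0': (doc) => ({
    ...doc,
    attributes: { ...doc.attributes, migrated: true },
  }),
};

const migrated = migrations['7.10.0']({ id: '1', attributes: {} });
```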
Did not test it locally, but the change looks good. We should index as few fields as we need.
This reminds me also of #43673.
Tested locally and WFM
@elasticmachine merge upstream
💚 Build Succeeded
Summary
Resolves #69917, #69682.
We should not index all of our saved object fields; I removed all the ones I know are not used for search.
This will also fix integrations with large variables.