
[Security Solution] data_stream.namespace field mapping missing from .internal.alerts-security.alerts-default indices #156060

Open
Tracked by #165878
ccdta opened this issue Apr 27, 2023 · 17 comments
Labels
bug (Fixes for quality problems that affect the customer experience) · Feature:Detection Alerts (Security Solution Detection Alerts Feature) · impact:high (Addressing this issue will have a high level of impact on the quality/strength of our product.) · Team:Detection Engine (Security Solution Detection Engine Area) · Team:Detections and Resp (Security Detection Response Team) · Team: SecuritySolution (Security Solutions Team working on SIEM, Endpoint, Timeline, Resolver, etc.)

Comments

ccdta commented Apr 27, 2023

Describe the bug:
The data_stream.namespace field cannot be filtered on in the .internal.alerts-security.alerts-default indices.

Kibana/Elasticsearch Stack version:
8.7.0
Server OS version:

Browser and Browser OS versions:

Elastic Endpoint version:

Original install method (e.g. download page, yum, from source, etc.):

Functional Area (e.g. Endpoint management, timelines, resolver, etc.):

Steps to reproduce:

  1. Generate Security Alerts
  2. In Kibana > Security > Alerts: with security alerts generated that include a data_stream.namespace in the alert context, apply the KQL filter data_stream.namespace: *; no results are returned.
  3. Likewise in Discover, running this filter against the .internal.alerts-security.alerts-default documents returns no results.

Current behavior:
The data_stream.namespace, data_stream.dataset, and data_stream.type fields in the .internal.alerts-security.alerts-default indices are not searchable, even though they appear in each security alert document.

We added these fields as runtime fields in the native .alerts-security.alerts-mappings component template and reindexed, which resolved the issue. However, every cluster upgrade resets the template and we have to repeat this process.

{
  "runtime": {
    "data_stream.namespace": {
      "type": "keyword"
    }
  }
}

Expected behavior:
These fields should be searchable in the alerts indices.

Screenshots (if relevant):

Errors in browser console (if relevant):

Provide logs and/or server output (if relevant):

Any additional context (logs, chat logs, magical formulas, etc.):

@ccdta added the bug, Team: SecuritySolution, and triage_needed labels on Apr 27, 2023
@elasticmachine
Contributor

Pinging @elastic/security-solution (Team: SecuritySolution)

@MadameSheema added the Team:Detections and Resp and Team:Detection Alerts labels on May 2, 2023
@elasticmachine
Contributor

Pinging @elastic/security-detections-response (Team:Detections and Resp)

@MilkyEsquire

@ccdta We are currently hitting the same limitation with multiple clients and need to filter the alerts. Could you expand on the workaround you are using (where exactly you are adding the fields, etc.)? It would be handy for us too.

ccdta (Author) commented May 2, 2023

@MilkyEsquire
Workaround:

  1. In the 'alerts-security.alerts-mappings' component template, add the runtime field for data_stream.namespace (screenshot below).
  2. Roll over your current .internal.alerts-security.alerts-default-00000* index.
  3. Create a clone of the old .internal.alerts-security.alerts-default-00000* index, this time adding the runtime field to its mapping.
  4. Reindex the old .internal.alerts-security.alerts-default-00000* index into the new clone.
  5. The filters should now work.

[screenshot: runtime field definition for data_stream.namespace in the component template mappings]

Note: whenever the cluster is upgraded, the component template reverts and the workaround has to be applied again.
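The manual steps above can be scripted against the Elasticsearch REST API. A minimal sketch in Python that just assembles the requests (the index and template names are the defaults mentioned in this thread, and the clone index name is an assumption; note that overwriting the managed component template is exactly the part that gets reverted on upgrade):

```python
import json

# Runtime-field mapping from the workaround above: expose
# data_stream.namespace as a keyword runtime field so KQL
# filters like `data_stream.namespace: *` return results.
RUNTIME_MAPPING = {
    "runtime": {
        "data_stream.namespace": {
            "type": "keyword"
        }
    }
}

def build_workaround_requests(
    alias=".alerts-security.alerts-default",
    old_index=".internal.alerts-security.alerts-default-000001",
    clone_index=".internal.alerts-security.alerts-default-000001-clone",
):
    """Return (method, path, body) tuples for the manual workaround:
    patch the component template, roll over, then reindex the old
    backing index into a clone that carries the runtime field."""
    return [
        # 1. Add the runtime field to the component template's mappings.
        ("PUT", "/_component_template/.alerts-security.alerts-mappings",
         {"template": {"mappings": RUNTIME_MAPPING}}),
        # 2. Roll over the write index so new backing indices exist.
        ("POST", f"/{alias}/_rollover", None),
        # 3. Create the clone target with the runtime field in its mapping.
        ("PUT", f"/{clone_index}", {"mappings": RUNTIME_MAPPING}),
        # 4. Reindex the old backing index into the clone.
        ("POST", "/_reindex",
         {"source": {"index": old_index}, "dest": {"index": clone_index}}),
    ]

if __name__ == "__main__":
    for method, path, body in build_workaround_requests():
        print(method, path, json.dumps(body) if body else "")
```

Sending these requests (with curl or a client library) reproduces steps 1 through 4; authentication and the cluster URL are left out of the sketch.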

@MilkyEsquire

@ccdta
Great thanks for that, worked nicely. We submitted a ticket to Elastic Support around this missing feature a couple of months back also, so hopefully they will add this functionality in!

@yctercero added the Team:Detection Engine label and removed the Team:Detection Alerts label on May 13, 2023
ccdta (Author) commented Jun 5, 2023

Elastic team, any updates on this?

@yctercero
Contributor

@elastic/response-ops is this something you guys could look at?

ymao1 (Contributor) commented Jul 11, 2023

@yctercero We originally included the data_stream.* mappings in the ECS component template that the alert indices reference, but ran into issues: the detection rules use the mappings to determine which fields to copy from the source indices to the alert indices. Because these fields are mapped as constant_keyword and the alert indices hold data from multiple different sources, copying from multiple sources would break the constant_keyword mapping (a constant_keyword field must hold the same value for every document in an index).

We therefore omitted any fields mapped as constant_keyword, which in practice is just the data_stream.* fields. If we want to include them in the ECS component template, the detection rules' build-alert logic needs to be updated to not copy these fields from the source index to the alerts index; otherwise there will be many indexing errors.

cc @kqualters-elastic
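The copy-then-omit behavior described above can be sketched as follows. This is a simplified illustration, not the actual detection-engine code; the helper and the example field values are hypothetical:

```python
def copy_source_fields(source_doc, mappings):
    """Copy fields from a source document into a new alert document,
    skipping any field whose mapping type is constant_keyword.
    `mappings` maps flattened field names to mapping definitions."""
    return {
        field: value
        for field, value in source_doc.items()
        if mappings.get(field, {}).get("type") != "constant_keyword"
    }

# data_stream.* fields are constant_keyword in ECS, so they are
# dropped when building the alert document; ordinary keyword
# fields are copied through.
mappings = {
    "host.name": {"type": "keyword"},
    "data_stream.namespace": {"type": "constant_keyword"},
    "data_stream.dataset": {"type": "constant_keyword"},
}
source_doc = {
    "host.name": "web-01",
    "data_stream.namespace": "prod",
    "data_stream.dataset": "system.syslog",
}
alert_fields = copy_source_fields(source_doc, mappings)
# alert_fields == {"host.name": "web-01"}
```

This is why the alert documents can contain the values (they arrive via other paths) while the alerts indices have no searchable mapping for them.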

ymao1 (Contributor) commented Jul 11, 2023

It was also discussed that we could include these fields but change the mapping to keyword instead of constant_keyword; however, diverging from the ECS mapping like this would show up in the data quality dashboard.

cc @kqualters-elastic

@yctercero
Contributor

Thanks for the background @ymao1. Apologies if I missed these convos earlier. I'll bring it up at our Advanced Correlation meeting to see what we want to do here.

cc @paulewing @peluja1012 @marshallmain

fitz003 commented Mar 25, 2024

@yctercero I was curious if there was any update on this issue?

LaZyDK commented Sep 2, 2024

Any updates?

@yctercero added the impact:high label on Sep 28, 2024
@MSSP-BLKing

As an MSSP, we use Elastic to monitor our clients' environments. We have one use case where having data_stream.namespace mapped is quite important to us.

We measure and report on key metrics for each client, such as the number of alerts, the number of closed alerts, and the number of alerts in progress. Unless I hack the index template and reapply the change after every upgrade, I can't break this data down by data_stream.namespace.

With our setup there's no reason to duplicate the SIEM rules across other spaces, which I assume is the other option. Adding the mapping to the index template works, and having an Alerts@Custom index component template would also work; that seems like an easy solution to implement.

Another use for this would be filtering the alerts console to a specific client. That would be especially useful during an active cybersecurity event, where alerts from other clients (physically separated from the client under attack) are a lower priority and would only confuse the incident response team.

This could also help with workload balancing among SOC analysts, since they are sometimes assigned to one client rather than the whole repository.

I'm not familiar enough with the deployment code and the process from GitHub out to production environments. But at a bare minimum, and maybe as an additional request, deploying an Alerts@Custom index component template would allow adding mappings for this and other fields. And while you have the code cracked open, an Alerts@custom ingest processor would let us add an enrichment processor to our alerts index. I use one to bring the max vulnerability priority rating for the host into the alert index; it adds another level of risk knowledge for the analyst.
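The enrichment idea above could be expressed as an ingest pipeline body using Elasticsearch's enrich processor. A minimal sketch in Python (the alerts@custom hook does not exist today per this thread, and the pipeline name, enrich policy name, and field names are all assumptions):

```python
# Hypothetical "alerts@custom" ingest pipeline body: use an enrich
# processor to copy a host's max vulnerability priority onto each
# alert document, keyed on host.name.
alerts_custom_pipeline = {
    "description": "Enrich alerts with the host's max vulnerability priority",
    "processors": [
        {
            "enrich": {
                "policy_name": "host-vuln-priority-policy",   # assumed enrich policy
                "field": "host.name",                         # match field on the alert
                "target_field": "vulnerability.max_priority", # assumed target field
                "ignore_missing": True,
            }
        }
    ],
}
# If such a hook existed, this body would be installed with:
#   PUT _ingest/pipeline/alerts@custom
```

The enrich processor itself (policy_name, field, target_field, ignore_missing) is standard Elasticsearch; only the alerts@custom attachment point is the feature being requested here.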

DigitalKraken commented Oct 11, 2024

So we found another workaround for this that prevents you from having to modify a component template and re-index. This also means you don't have to apply the fix every time you upgrade the cluster. You can filter on 'kibana.alert.ancestors.index : *-NAMESPACE-*' (replace NAMESPACE with your namespace obviously). We have verified that this works.
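This workaround works because kibana.alert.ancestors.index holds the name of the source index, and data stream backing indices conventionally embed the namespace in their name (.ds-<type>-<dataset>-<namespace>-<date>-<generation>). A quick illustration of the wildcard match, with example index names as assumptions:

```python
from fnmatch import fnmatch

# Backing index names embed the namespace between hyphens, so a
# *-NAMESPACE-* wildcard (the KQL filter from the workaround)
# selects only the indices for that namespace.
indices = [
    ".ds-logs-system.syslog-prod-2024.10.01-000001",
    ".ds-logs-system.syslog-staging-2024.10.01-000001",
    ".ds-logs-endpoint.events.process-prod-2024.10.01-000002",
]
prod_only = [i for i in indices if fnmatch(i, "*-prod-*")]
# prod_only keeps only the two indices whose namespace is "prod"
```

The hyphens in the pattern matter: a bare *prod* could also match a dataset or hostname that happens to contain "prod".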

LaZyDK commented Dec 18, 2024

So we found another workaround for this that prevents you from having to modify a component template and re-index. This also means you don't have to apply the fix every time you upgrade the cluster. You can filter on 'kibana.alert.ancestors.index : *-NAMESPACE-*' (replace NAMESPACE with your namespace obviously). We have verified that this works.

This is a nice workaround. It works on 8.17.0.

But we would still like the data_stream.namespace field to be present going forward, Elastic ;)

@MSSP-BLKing

So we found another workaround for this that prevents you from having to modify a component template and re-index. This also means you don't have to apply the fix every time you upgrade the cluster. You can filter on 'kibana.alert.ancestors.index : *-NAMESPACE-*' (replace NAMESPACE with your namespace obviously). We have verified that this works.

This is a nice workaround. It works on 8.17.0.

But we would still like the data_stream.namespace field to be present going forward, Elastic ;)

kibana.alert.ancestors.index: *NAMESPACE*

I had to use *NAMESPACE* (without the hyphens) instead of *-NAMESPACE-*, but it did work! I'm not sure if it'll work in my dashboard, but it works from the security alert console on 8.17.0.

Very much concur, still need the data_stream.namespace option too.

@yctercero
Contributor

@LaZyDK @MSSP-BLKing we certainly hear you! :) Thank you for confirming the workaround.

We will look to prioritize this one and update when there's movement on our front. We appreciate your patience!
