From 2ae30f1aad27db15575b91fac431bfb67008c2bc Mon Sep 17 00:00:00 2001 From: Benjamin Ironside Goldstein Date: Mon, 18 Nov 2024 13:27:18 -0800 Subject: [PATCH 1/6] Adds web crawler example to Knowledge Base doc --- docs/AI-for-security/knowledge-base.asciidoc | 58 ++++++++++++++++++-- 1 file changed, 53 insertions(+), 5 deletions(-) diff --git a/docs/AI-for-security/knowledge-base.asciidoc b/docs/AI-for-security/knowledge-base.asciidoc index 2f5414822b..a2b8340647 100644 --- a/docs/AI-for-security/knowledge-base.asciidoc +++ b/docs/AI-for-security/knowledge-base.asciidoc @@ -44,7 +44,7 @@ image::images/knowledge-base-assistant-menu-dropdown.png[AI Assistant's dropdown [discrete] === Option 2: Enable Knowledge Base from the Security AI settings -. To open Security AI settings, use the {kibana-ref}/introduction.html#kibana-navigation-search[global search field] to find "AI Assistant for Security." +. To open **Security AI settings**, use the {kibana-ref}/introduction.html#kibana-navigation-search[global search field] to find "AI Assistant for Security." . On the **Knowledge Base** tab, click **Setup Knowledge Base**. If the button doesn't appear, Knowledge Base is already enabled. image::images/knowledge-base-assistant-settings-kb-tab.png[AI Assistant's settings menu open to the Knowledge Base tab] @@ -57,7 +57,7 @@ When Knowledge Base is enabled, AI Assistant receives `open` or `acknowledged` a To enable Knowledge Base for alerts: . Ensure that knowledge base is <>. -. Use the slider on the Security AI settings' Knowledge Base tab to select the number of alerts to send to AI Assistant. Click **Save**. +. Use the slider on the **Security AI settings** page's Knowledge Base tab to select the number of alerts to send to AI Assistant. Click **Save**. NOTE: Including a large number of alerts may cause your request to exceed the maximum token length of your third-party generative AI provider. If this happens, try selecting a lower number of alerts to send. 
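The token-limit note above can be made concrete with a quick back-of-the-envelope check. This sketch (the ~4-characters-per-token ratio and the alert fields are illustrative assumptions, not your provider's real tokenizer or Elastic's implementation) estimates how many alerts fit under a given token budget:

```python
# Rough sketch of the tradeoff in the note above: more alerts as context means
# more tokens. The ~4-characters-per-token ratio is a common rule of thumb,
# not a real tokenizer, so treat the result as an estimate only.
import json

def estimate_tokens(alert: dict) -> int:
    """Approximate token count of one alert serialized as JSON."""
    return len(json.dumps(alert)) // 4

def max_alerts_within_budget(alerts: list[dict], token_budget: int) -> int:
    """How many alerts (taken in order) fit under the model's token budget."""
    used = 0
    for i, alert in enumerate(alerts):
        used += estimate_tokens(alert)
        if used > token_budget:
            return i
    return len(alerts)

# Hypothetical alert documents, just to exercise the estimate.
alerts = [{"rule": "Suspicious PowerShell", "host": f"host-{i}", "severity": "high"}
          for i in range(500)]
print(max_alerts_within_budget(alerts, token_budget=8000))
```

If the estimate comes in well under your provider's context window, the slider value is probably safe; otherwise lower it.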
@@ -65,7 +65,7 @@ NOTE: Including a large number of alerts may cause your request to exceed the ma [[knowledge-base-add-knowledge]] == Add knowledge -To view all knowledge base entries, go to the Security AI settings and select the **Knowledge Base** tab. You can add individual documents or entire indices containing multiple documents. Each entry in the Knowledge Base (a document or index) has a **Sharing** setting of `private` or `global`. Private entries apply to the current user only and do not affect other users in the {kib} space, whereas global entries affect all users. Each entry can also have a `Required knowledge` setting, which means it will be included as context for every message sent to AI Assistant. +To view all knowledge base entries, go to **Security AI settings** and select the **Knowledge Base** tab. You can add individual documents or entire indices containing multiple documents. Each entry in the Knowledge Base (a document or index) has a **Sharing** setting of `private` or `global`. Private entries apply to the current user only and do not affect other users in the {kib} space, whereas global entries affect all users. Each entry can also have a `Required knowledge` setting, which means it will be included as context for every message sent to AI Assistant. NOTE: When you enable Knowledge Base, it comes pre-populated with articles from https://www.elastic.co/security-labs[Elastic Security Labs], current through September 30, 2024, which allows AI Assistant to leverage Elastic's security research during your conversations. This enables it to answer questions such as, β€œAre there any new tactics used against Windows hosts that I should be aware of when investigating my alerts?” @@ -75,7 +75,7 @@ NOTE: When you enable Knowledge Base, it comes pre-populated with articles from Add an individual document to Knowledge Base when you want AI Assistant to remember a specific piece of information. -. 
To open Security AI settings, use the {kibana-ref}/introduction.html#kibana-navigation-search[global search field] to find "AI Assistant for Security." Select the **Knowledge Base** tab. +. To open **Security AI settings**, use the {kibana-ref}/introduction.html#kibana-navigation-search[global search field] to find "AI Assistant for Security." Select the **Knowledge Base** tab. . Click **New β†’ Document** and give it a name. . Under **Sharing**, select whether this knowledge should be **Global** or **Private**. . Write the knowledge AI Assistant should remember in the **Markdown text** field. @@ -108,7 +108,7 @@ Add an index as a knowledge source when you want new information added to that i IMPORTANT: Indices added to Knowledge Base must have at least one field mapped as {ref}/semantic-text.html[semantic text]. -. To open Security AI settings, use the {kibana-ref}/introduction.html#kibana-navigation-search[global search field] to find "AI Assistant for Security." Select the **Knowledge Base** tab. +. To open **Security AI settings**, use the {kibana-ref}/introduction.html#kibana-navigation-search[global search field] to find "AI Assistant for Security." Select the **Knowledge Base** tab. . Click **New β†’ Index**. . Name the knowledge source. . Under **Sharing**, select whether this knowledge should be **Global** or **Private**. @@ -136,3 +136,51 @@ Refer to the following video for an example of adding an index to Knowledge Base
++++ ======= + +[discrete] +[[knowledge-base-crawler-or-connector]] +=== Add knowledge with a connector or web crawler + +You can use an {es} connector or web crawler to create an index that contains data you want to add to Knowledge Base. + +This section provides an example of adding a threat intelligence feed to Knowledge Base using a web crawler. For more information on adding data to {es} using a connector, refer to {ref}/es-connectors.html[Ingest data with Elastic connectors]. For more information on web crawlers, refer to {ref}/crawler.html[Elastic web crawler]. + +[discrete] +==== Use a web crawler to add threat intelligence to Knowledge Base + +First, you'll need to set up a web crawler to add the desired data to an index, then you'll need to add that index to Knowledge Base. + +. From the **Search** section of {kib}, find **Web crawlers** in the navigation menu or use the {kibana-ref}/introduction.html#kibana-navigation-search[global search field]. +. Click **New web crawler**. +.. Under **Index name**, name the index where the data from your new web crawler will be stored, for example `threat_intelligence_feed_1`. Click **Create index**. +.. Under **Domain URL**, enter the URL where the web crawler should collect data. Click **Validate Domain** to test it, then **Add domain**. +. The previous step opens a page with the details of your new crawler. Go to its **Mappings** tab, then click **Add field**. ++ +NOTE: Remember, each index added to Knowledge Base must have at least one semantic text field. +.. Under **Field type**, select `Semantic text`. Under **Select an inference endpoint**, select `elastic-security-ai-assistant-elser2`. Click **Add field**, then **Save mapping**. +. Go to the **Scheduling** tab. Enable the **Enable recurring crawls with the following schedule** setting, and define your desired schedule. +. Go to the **Manage Domains** tab. 
Select the domain associated with your new web crawler, then go to its **Crawl rules** tab and click **Add crawl rule**. +.. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**. +.. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**. +.. Click **Crawl**, then **Crawl all domains on this index**. A message appears that says "Successfully scheduled a sync, waiting for a connector to pick it up". +. The crawl process will take longer for larger data sources. Once it finishes, your new web crawler's index will contain documents provided by the crawler. +. Finally, follow the instructions to <>. Add the index that contains the data from your new web crawler (`threat_intelligence_feed_1` in this example). + +Your new threat intelligence data is now included in Knowledge Base and can inform AI Assistant's responses. + +Refer to the following video for an example of creating a web crawler to ingest threat intelligence data and adding it to Knowledge Base. + +======= +++++ + + +
+++++ +======= \ No newline at end of file From 286179be74ff6c05b19eb83263ba0232cebbe025 Mon Sep 17 00:00:00 2001 From: Benjamin Ironside Goldstein Date: Mon, 18 Nov 2024 13:37:11 -0800 Subject: [PATCH 2/6] various minor edits --- docs/AI-for-security/knowledge-base.asciidoc | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/AI-for-security/knowledge-base.asciidoc b/docs/AI-for-security/knowledge-base.asciidoc index a2b8340647..6f66870d8f 100644 --- a/docs/AI-for-security/knowledge-base.asciidoc +++ b/docs/AI-for-security/knowledge-base.asciidoc @@ -143,7 +143,7 @@ Refer to the following video for an example of adding an index to Knowledge Base You can use an {es} connector or web crawler to create an index that contains data you want to add to Knowledge Base. -This section provides an example of adding a threat intelligence feed to Knowledge Base using a web crawler. For more information on adding data to {es} using a connector, refer to {ref}/es-connectors.html[Ingest data with Elastic connectors]. For more information on web crawlers, refer to {ref}/crawler.html[Elastic web crawler]. +This section provides an example of adding a threat intelligence feed to Knowledge Base using a web crawler. For more information on adding data to {es} using a connector, refer to {ref}/es-connectors.html[Ingest data with Elastic connectors]. For more information on web crawlers, refer to {enterprise-search-ref}/crawler.html[Elastic web crawler]. [discrete] ==== Use a web crawler to add threat intelligence to Knowledge Base @@ -154,12 +154,12 @@ First, you'll need to set up a web crawler to add the desired data to an index, . Click **New web crawler**. .. Under **Index name**, name the index where the data from your new web crawler will be stored, for example `threat_intelligence_feed_1`. Click **Create index**. .. Under **Domain URL**, enter the URL where the web crawler should collect data. Click **Validate Domain** to test it, then **Add domain**. -. 
The previous step opens a page with the details of your new crawler. Go to its **Mappings** tab, then click **Add field**. +. The previous step opens a page with the details of your new index. Go to its **Mappings** tab, then click **Add field**. + NOTE: Remember, each index added to Knowledge Base must have at least one semantic text field. .. Under **Field type**, select `Semantic text`. Under **Select an inference endpoint**, select `elastic-security-ai-assistant-elser2`. Click **Add field**, then **Save mapping**. . Go to the **Scheduling** tab. Enable the **Enable recurring crawls with the following schedule** setting, and define your desired schedule. -. Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go the its **Crawl rules** tab and click **Add crawl rule**. +. Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go the its **Crawl rules** tab and click **Add crawl rule**. For more information, refer to {enterprise-search-ref}/crawler-extraction-rules.html[Web crawler content extraction rules]. .. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**. .. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**. .. Click **Crawl**, then **Crawl all domains on this index**. A message appears that says "Successfully scheduled a sync, waiting for a connector to pick it up". 
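The **Mappings** step in the patch above corresponds to an index mapping containing a `semantic_text` field tied to an inference endpoint. As a rough illustration (the field name is hypothetical; the inference endpoint ID is the one named in the steps), this sketch builds the body you could also send yourself, for example via Kibana's Dev Tools with `PUT threat_intelligence_feed_1`:

```python
# Sketch of the index mapping the UI's "Add field" step produces: at least one
# "semantic_text" field wired to an inference endpoint. The field name is
# illustrative; the endpoint ID comes from the documented steps.
import json

index_body = {
    "mappings": {
        "properties": {
            "body_semantic": {
                "type": "semantic_text",
                "inference_id": "elastic-security-ai-assistant-elser2",
            }
        }
    }
}

# Request body for: PUT /threat_intelligence_feed_1
print(json.dumps(index_body, indent=2))
```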
From 5fad1637d01b46dcc44d85376e118f65475546c0 Mon Sep 17 00:00:00 2001 From: Benjamin Ironside Goldstein Date: Mon, 18 Nov 2024 14:15:41 -0800 Subject: [PATCH 3/6] fixes note bug --- docs/AI-for-security/knowledge-base.asciidoc | 1 + 1 file changed, 1 insertion(+) diff --git a/docs/AI-for-security/knowledge-base.asciidoc b/docs/AI-for-security/knowledge-base.asciidoc index 6f66870d8f..49d9c6cc43 100644 --- a/docs/AI-for-security/knowledge-base.asciidoc +++ b/docs/AI-for-security/knowledge-base.asciidoc @@ -157,6 +157,7 @@ First, you'll need to set up a web crawler to add the desired data to an index, . The previous step opens a page with the details of your new index. Go to its **Mappings** tab, then click **Add field**. + NOTE: Remember, each index added to Knowledge Base must have at least one semantic text field. ++ .. Under **Field type**, select `Semantic text`. Under **Select an inference endpoint**, select `elastic-security-ai-assistant-elser2`. Click **Add field**, then **Save mapping**. . Go to the **Scheduling** tab. Enable the **Enable recurring crawls with the following schedule** setting, and define your desired schedule. . Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go to its **Crawl rules** tab and click **Add crawl rule**. For more information, refer to {enterprise-search-ref}/crawler-extraction-rules.html[Web crawler content extraction rules]. 
From f5bda4a1da50499bd18507f1648323870587db4b Mon Sep 17 00:00:00 2001 From: Benjamin Ironside Goldstein Date: Tue, 19 Nov 2024 12:33:19 -0800 Subject: [PATCH 4/6] incorporates James feedback --- docs/AI-for-security/knowledge-base.asciidoc | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/AI-for-security/knowledge-base.asciidoc b/docs/AI-for-security/knowledge-base.asciidoc index 49d9c6cc43..c696a1a435 100644 --- a/docs/AI-for-security/knowledge-base.asciidoc +++ b/docs/AI-for-security/knowledge-base.asciidoc @@ -154,7 +154,7 @@ First, you'll need to set up a web crawler to add the desired data to an index, . Click **New web crawler**. .. Under **Index name**, name the index where the data from your new web crawler will be stored, for example `threat_intelligence_feed_1`. Click **Create index**. .. Under **Domain URL**, enter the URL where the web crawler should collect data. Click **Validate Domain** to test it, then **Add domain**. -. The previous step opens a page with the details of your new index. Go to its **Mappings** tab, then click **Add field**. +. The previous step opens a page with the details of your new index. Go to its **Mappings** tab, then click **Add field**. + NOTE: Remember, each index added to Knowledge Base must have at least one semantic text field. + @@ -162,7 +162,7 @@ NOTE: Remember, each index added to Knowledge Base must have at least one semant . Go to the **Scheduling** tab. Enable the **Enable recurring crawls with the following schedule** setting, and define your desired schedule. . Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go to its **Crawl rules** tab and click **Add crawl rule**. For more information, refer to {enterprise-search-ref}/crawler-extraction-rules.html[Web crawler content extraction rules]. .. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. 
Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**. -.. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**. +.. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**. Make sure this rule appears below the rule created in the previous step on the list. .. Click **Crawl**, then **Crawl all domains on this index**. A message appears that says "Successfully scheduled a sync, waiting for a connector to pick it up". . The crawl process will take longer for larger data sources. Once it finishes, your new web crawler's index will contain documents provided by the crawler. . Finally, follow the instructions to <>. Add the index that contains the data from your new web crawler (`threat_intelligence_feed_1` in this example). From 38396dd342d5adcb798e536d6f5cbcfec15e8fef Mon Sep 17 00:00:00 2001 From: Benjamin Ironside Goldstein Date: Tue, 19 Nov 2024 15:03:48 -0800 Subject: [PATCH 5/6] incorporates Charles' review --- docs/AI-for-security/knowledge-base.asciidoc | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/AI-for-security/knowledge-base.asciidoc b/docs/AI-for-security/knowledge-base.asciidoc index c696a1a435..34dbf268a1 100644 --- a/docs/AI-for-security/knowledge-base.asciidoc +++ b/docs/AI-for-security/knowledge-base.asciidoc @@ -161,8 +161,8 @@ NOTE: Remember, each index added to Knowledge Base must have at least one semant .. Under **Field type**, select `Semantic text`. Under **Select an inference endpoint**, select `elastic-security-ai-assistant-elser2`. Click **Add field**, then **Save mapping**. . Go to the **Scheduling** tab. Enable the **Enable recurring crawls with the following schedule** setting, and define your desired schedule. . Go to the **Manage Domains** tab. 
Select the domain associated with your new web crawler, then go to its **Crawl rules** tab and click **Add crawl rule**. For more information, refer to {enterprise-search-ref}/crawler-extraction-rules.html[Web crawler content extraction rules]. -.. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**. -.. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**. Make sure this rule appears below the rule created in the previous step on the list. +.. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**. +.. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**. Make sure this rule appears below the rule created in the previous step on the list. .. Click **Crawl**, then **Crawl all domains on this index**. A message appears that says "Successfully scheduled a sync, waiting for a connector to pick it up". . The crawl process will take longer for larger data sources. Once it finishes, your new web crawler's index will contain documents provided by the crawler. . Finally, follow the instructions to <>. Add the index that contains the data from your new web crawler (`threat_intelligence_feed_1` in this example). 
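The crawl-rule reordering in the patch above matters because the crawler evaluates rules in list order and the first matching rule decides a URL's fate. This toy sketch (an illustration of that order-sensitive behavior, not the Elastic crawler's actual code; the rule shapes are simplified) shows how the placement of the `Allow` rule relative to the catch-all `Disallow` rule changes the outcome:

```python
# Toy model of order-sensitive, first-match-wins crawl-rule evaluation.
# Illustrates why rule placement matters; not the crawler's implementation.
import re

def is_allowed(path: str, rules: list[tuple[str, str, str]]) -> bool:
    """rules are (policy, rule_type, pattern); the first match decides."""
    for policy, rule_type, pattern in rules:
        if rule_type == "contains":
            matched = pattern in path
        else:  # "regex"
            matched = re.fullmatch(pattern, path) is not None
        if matched:
            return policy == "allow"
    return True  # nothing matched: fall through to the default allow-all rule

allow_rule = ("allow", "contains", "threat-intelligence")
catch_all = ("disallow", "regex", ".*")

print(is_allowed("/threat-intelligence/feed", [allow_rule, catch_all]))  # True
print(is_allowed("/threat-intelligence/feed", [catch_all, allow_rule]))  # False
```

With the catch-all first, every URL hits the `Disallow` rule before the `Allow` rule is ever consulted, so nothing is crawled; hence the doc's warning to double-check where each rule lands in the list.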
From 6cffa0e51d8e53c055d36a10a2af7303684ece6b Mon Sep 17 00:00:00 2001 From: Benjamin Ironside Goldstein Date: Wed, 20 Nov 2024 08:28:43 -0800 Subject: [PATCH 6/6] incorporates review --- docs/AI-for-security/knowledge-base.asciidoc | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/docs/AI-for-security/knowledge-base.asciidoc b/docs/AI-for-security/knowledge-base.asciidoc index 34dbf268a1..e345df14be 100644 --- a/docs/AI-for-security/knowledge-base.asciidoc +++ b/docs/AI-for-security/knowledge-base.asciidoc @@ -57,7 +57,7 @@ When Knowledge Base is enabled, AI Assistant receives `open` or `acknowledged` a To enable Knowledge Base for alerts: . Ensure that knowledge base is <>. -. Use the slider on the **Security AI settings** page's Knowledge Base tab to select the number of alerts to send to AI Assistant. Click **Save**. +. On the **Security AI settings** page, go to the **Knowledge Base** tab and use the slider to select the number of alerts to send to AI Assistant. Click **Save**. NOTE: Including a large number of alerts may cause your request to exceed the maximum token length of your third-party generative AI provider. If this happens, try selecting a lower number of alerts to send. @@ -163,8 +163,7 @@ NOTE: Remember, each index added to Knowledge Base must have at least one semant . Go to the **Manage Domains** tab. Select the domain associated with your new web crawler, then go to its **Crawl rules** tab and click **Add crawl rule**. For more information, refer to {enterprise-search-ref}/crawler-extraction-rules.html[Web crawler content extraction rules]. .. Click **Add crawl rule** again. Under **Policy**, select `Disallow`. Under **Rule**, select `Regex`. Under **Path pattern**, enter `.*`. Click **Save**. .. Under **Policy**, select `Allow`. Under **Rule**, select `Contains`. Under **Path pattern**, enter your path pattern, for example `threat-intelligence`. Click **Save**. 
Make sure this rule appears below the rule created in the previous step on the list. -.. Click **Crawl**, then **Crawl all domains on this index**. A message appears that says "Successfully scheduled a sync, waiting for a connector to pick it up". -. The crawl process will take longer for larger data sources. Once it finishes, your new web crawler's index will contain documents provided by the crawler. +.. Click **Crawl**, then **Crawl all domains on this index**. A success message appears. The crawl process will take longer for larger data sources. Once it finishes, your new web crawler's index will contain documents provided by the crawler. . Finally, follow the instructions to <>. Add the index that contains the data from your new web crawler (`threat_intelligence_feed_1` in this example). Your new threat intelligence data is now included in Knowledge Base and can inform AI Assistant's responses.
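Since every index added to Knowledge Base needs at least one `semantic_text` field, a quick pre-flight check of the mapping can save a failed attempt. A sketch, assuming a mapping dict shaped like a `GET <index>/_mapping` response (index and field names are examples from this doc, not requirements):

```python
# Sketch: check that an index mapping contains at least one semantic_text
# field before adding the index to Knowledge Base. The sample dict mirrors
# the shape of a GET <index>/_mapping response; names are illustrative.
def has_semantic_text(properties: dict) -> bool:
    """Recursively look for a semantic_text field in mapping properties."""
    for field in properties.values():
        if field.get("type") == "semantic_text":
            return True
        if has_semantic_text(field.get("properties", {})):  # nested objects
            return True
    return False

mapping_response = {
    "threat_intelligence_feed_1": {
        "mappings": {
            "properties": {
                "title": {"type": "text"},
                "body_semantic": {
                    "type": "semantic_text",
                    "inference_id": "elastic-security-ai-assistant-elser2",
                },
            }
        }
    }
}

props = mapping_response["threat_intelligence_feed_1"]["mappings"]["properties"]
print(has_semantic_text(props))  # True
```

If the check fails, add a `Semantic text` field on the index's **Mappings** tab as described above before adding the index to Knowledge Base.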