[BUG] Error shows up with "InteractionId is required" #377

Open
dmantas opened this issue Dec 16, 2024 · 0 comments
Labels: bug (Something isn't working)


What is the bug?
I'm trying to use the Dashboards Assistant by triggering a conversational flow agent. The first query to the Assistant works fine, but on the second query I always get an error: InteractionId is required.

How can one reproduce the bug?
Register an agent. I created the following conversational flow agent:

POST /_plugins/_ml/agents/_register
{
  "name": "Demo agent",
  "type": "conversational_flow",
  "description": "This is a test agent support running any search query",
  "memory": {
    "type": "conversation_index"
  },
  "app_type": "rag",
  "tools": [
    {
      "type": "SearchIndexTool",
      "parameters": {
        "input": "{\"index\": \"${parameters.index}\", \"query\": ${parameters.query} }",
        "index": "my-index",
        "query": {
          "query": {
            "match": {
              "log": "${parameters.question}"
            }
          },
          "size": 100,
          "_source": "log"
        }
      }
    },
    {
      "type": "MLModelTool",
      "description": "A general tool to answer any question",
      "name": "chatgpt_4o_mini",
      "parameters": {
        "model_id": "Jwdud5MBoxapYathgFpN",
        "prompt": "Context:\n${parameters.SearchIndexTool.output}\n\nHuman:${parameters.question}\n\nAssistant:"
      }
    }
  ]
}
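(For reproduction, the conversational flow agent can also be invoked directly via the agent execute API; the agent ID and question below are placeholders:)

POST /_plugins/_ml/agents/<conversational_flow_agent_id>/_execute
{
  "parameters": {
    "question": "what errors appear in the logs?"
  }
}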

And I've registered a root chatbot agent:

POST /_plugins/_ml/agents/_register
{
  "name": "Chatbot agent",
  "type": "flow",
  "description": "this is a test chatbot agent",
  "tools": [
    {
      "type": "AgentTool",
      "name": "LLMResponseGenerator",
      "parameters": {
        "agent_id": "Ibbld5MBzismXehyVPWJ" 
      },
      "include_output_in_agent_response": true
    },
    {
      "type": "MLModelTool",
      "name": "QuestionSuggestor",
      "description": "A general tool to answer any question",
      "parameters": {
        "model_id": "Jwdud5MBoxapYathgFpN",  
        "prompt": "Human:  You are an AI that only speaks JSON. Do not write normal text. Output should follow example JSON format: \n\n {\"response\": [\"question1\", \"question2\"]}\n\n. \n\nHuman:You will be given a chat history between OpenSearch Assistant and a Human.\nUse the context provided to generate follow up questions the Human would ask to the Assistant.\nThe Assistant can answer general questions about logs, traces and metrics.\nAssistant can access a set of tools listed below to answer questions given by the Human:\nQuestion suggestions generator tool\nHere's the chat history between the human and the Assistant.\n${parameters.LLMResponseGenerator.output}\nUse the following steps to generate follow up questions Human may ask after the response of the Assistant:\nStep 1. Use the chat history to understand what human is trying to search and explore.\nStep 2. Understand what capabilities the assistant has with the set of tools it has access to.\nStep 3. Use the above context and generate follow up questions.Step4:You are an AI that only speaks JSON. Do not write normal text. Output should follow example JSON format: \n\n {\"response\": [\"question1\", \"question2\"]} \n \n----------------\n\nAssistant:"
      },
      "include_output_in_agent_response": true
    }
  ],
  "memory": {
    "type": "conversation_index"
  }
}
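(For completeness: the root agent is then set as the Assistant's root chat agent. If I recall the setup correctly, that is done by writing an os_chat document into the ML Commons config index, roughly as below; the agent ID is a placeholder for the ID returned by the register call above:)

PUT /.plugins-ml-config/_doc/os_chat
{
  "type": "os_chat_root_agent",
  "configuration": {
    "agent_id": "<root_chatbot_agent_id>"
  }
}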

I'm able to query the agent via the Dashboards Assistant and get back a response, as shown in the screenshot.

But any follow-up question I ask fails with the error InteractionId is required.

Doing this query:

GET /_plugins/_ml/memory/PyJceJMBd-oTvid6Kwkw/messages

I can see both the follow-up message and its response, which leads me to believe the issue is on the Dashboards Assistant side.
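(If I read the memory APIs correctly, a single message can also be fetched directly; the message ID below is a placeholder taken from the messages response above:)

GET /_plugins/_ml/memory/message/<message_id>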

In the Dashboards log I see:

{"type":"log","@timestamp":"2024-11-29T14:39:37Z","tags":["error","plugins","assistantDashboards"],"pid":1,"message":"Error: interactionId is required\n    at AgentFrameworkStorageService.getInteraction (/usr/share/opensearch-dashboards/plugins/assistantDashboards/server/services/storage/agent_framework_storage_service.ts:261:13)\n    at /usr/share/opensearch-dashboards/plugins/assistantDashboards/server/routes/chat_routes.ts:231:52\n    at processTicksAndRejections (node:internal/process/task_queues:95:5)\n    at Router.handle (/usr/share/opensearch-dashboards/src/core/server/http/router/router.js:174:44)\n    at handler (/usr/share/opensearch-dashboards/src/core/server/http/router/router.js:140:50)\n    at exports.Manager.execute (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/toolkit.js:60:28)\n    at Object.internals.handler (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/handler.js:46:20)\n    at exports.execute (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/handler.js:31:20)\n    at Request._lifecycle (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/request.js:371:32)\n    at Request._execute (/usr/share/opensearch-dashboards/node_modules/@hapi/hapi/lib/request.js:281:9)"}

What is the expected behavior?
I expect the Assistant to return a response to each follow-up question.

What is your host/environment?

  • OS: Running on Kubernetes using the OpenSearch Operator

  • Version: 2.17

  • Plugins:

  • ML Commons related settings:

      "ml_commons": {
        "only_run_on_ml_node": "false",
        "rag_pipeline_feature_enabled": "true",
        "trusted_connector_endpoints_regex": [
          """^https://runtime\.sagemaker\..*[a-z0-9-]\.amazonaws\.com/.*$""",
          """^https://api\.openai\.com/.*$""",
          """^https://api\.cohere\.ai/.*$""",
          """^https://bedrock-runtime\..*[a-z0-9-]\.amazonaws\.com/.*$"""
        ],
        "agent_framework_enabled": "true",
        "native_memory_threshold": "100",
        "memory_feature_enabled": "true"

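(For reference, these correspond to the plugins.ml_commons.* dynamic cluster settings; a sketch of setting a few of them via the cluster settings API, in case it helps with reproduction:)

PUT /_cluster/settings
{
  "persistent": {
    "plugins.ml_commons.agent_framework_enabled": "true",
    "plugins.ml_commons.memory_feature_enabled": "true",
    "plugins.ml_commons.rag_pipeline_feature_enabled": "true",
    "plugins.ml_commons.only_run_on_ml_node": "false"
  }
}
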
Do you have any screenshots?
Screenshot is attached.


Do you have any additional context?
N/A
