---
description: LangChain Memory Nodes
---

# Memory

Memory lets you chat with the AI as if it remembers previous conversations:

```
Human: hi i am bob
AI: Hello Bob! It's nice to meet you. How can I assist you today?
Human: what's my name?
AI: Your name is Bob, as you mentioned earlier.
```

Under the hood, these conversations are stored in arrays or databases and provided as context to the LLM. For example:

```
You are an assistant to a human, powered by a large language model trained by OpenAI.

Whether the human needs help with a specific question or just wants to have a conversation about a particular topic, you are here to assist.

Current conversation:
{history}
```
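As a rough sketch of what happens here (the names below are illustrative, not Flowise's actual internals), the stored messages are serialized into a string and substituted into the prompt's `{history}` placeholder:

```typescript
// Illustrative sketch only -- not Flowise's actual internals.
interface StoredMessage {
  role: 'Human' | 'AI';
  content: string;
}

// Render the stored conversation into the {history} placeholder.
function buildPrompt(template: string, messages: StoredMessage[]): string {
  const history = messages
    .map((m) => `${m.role}: ${m.content}`)
    .join('\n');
  return template.replace('{history}', history);
}

const template = 'Current conversation:\n{history}\nHuman: {input}';
const prompt = buildPrompt(template, [
  { role: 'Human', content: 'hi i am bob' },
  { role: 'AI', content: "Hello Bob! It's nice to meet you." },
]);
```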

## Memory Nodes

## Separate conversations for multiple users

### UI & Embedded Chat

By default, the UI and Embedded Chat automatically separate different users' conversations by generating a unique chatId for each new interaction. Flowise handles this logic under the hood.

### Prediction API

You can separate conversations for multiple users by specifying a unique sessionId:

1. On every memory node, you should see an input parameter called Session ID.

2. In the POST body of the /api/v1/prediction/{your-chatflowid} request, specify the sessionId in overrideConfig:
```json
{
    "question": "hello!",
    "overrideConfig": {
        "sessionId": "user1"
    }
}
```
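As a minimal sketch, a call from Node.js might look like the following; the base URL and chatflow ID are placeholders to substitute with your own:

```typescript
// Placeholder values -- substitute your own Flowise URL and chatflow ID.
const FLOWISE_URL = 'http://localhost:3000';
const CHATFLOW_ID = 'your-chatflowid';

async function predict(question: string, sessionId: string) {
  const res = await fetch(`${FLOWISE_URL}/api/v1/prediction/${CHATFLOW_ID}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // The sessionId inside overrideConfig keeps each user's history separate.
    body: JSON.stringify({ question, overrideConfig: { sessionId } }),
  });
  return res.json();
}

// Each distinct sessionId maintains its own conversation history.
predict('hello!', 'user1');
predict('hello!', 'user2');
```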

### Message API

* GET /api/v1/chatmessage/{your-chatflowid}
* DELETE /api/v1/chatmessage/{your-chatflowid}
| Query Param | Type   | Value       |
| ----------- | ------ | ----------- |
| sessionId   | string |             |
| sort        | enum   | ASC or DESC |
| startDate   | string |             |
| endDate     | string |             |
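For example, here is a hedged sketch of fetching and clearing a session's messages with these endpoints (again, the base URL and chatflow ID are placeholders):

```typescript
// Placeholder values -- substitute your own Flowise URL and chatflow ID.
const FLOWISE_URL = 'http://localhost:3000';
const CHATFLOW_ID = 'your-chatflowid';

// Fetch all messages for one session, oldest first.
async function getMessages(sessionId: string) {
  const params = new URLSearchParams({ sessionId, sort: 'ASC' });
  const res = await fetch(
    `${FLOWISE_URL}/api/v1/chatmessage/${CHATFLOW_ID}?${params}`
  );
  return res.json();
}

// Delete a session's messages, e.g. to reset a user's conversation.
async function clearMessages(sessionId: string) {
  await fetch(
    `${FLOWISE_URL}/api/v1/chatmessage/${CHATFLOW_ID}?sessionId=${sessionId}`,
    { method: 'DELETE' }
  );
}
```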

All conversations can also be visualized and managed from the UI:

For the OpenAI Assistant, Threads are used to store conversations.