
feat: chat engine based on agents #909

Closed
parhammmm wants to merge 2 commits from the feature-agent-chat-engine branch

Conversation

parhammmm
Contributor

@marcusschiesser this one is the other engine we talked about.

Includes a fix for chat history in agents. Example usage:

  // AgentChatEngine (added in this PR) wraps an existing agent so it can be
  // used like the other chat engines; `llm` and the tool list are elided here.
  const agent = new OpenAIAgent({
    llm,
    tools: [...],
    systemPrompt: "..."
  });
  const engine = new AgentChatEngine({ agent });
  const result = await engine.chat({
    message: [
      {
        type: "text",
        text: "hi",
      },
    ],
    stream: false,
  });
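
For illustration only, a streaming call would presumably mirror the other chat engines and return an async iterable of Response chunks whose text sits on `response`; this sketch is an assumption, not part of the PR:

  // Sketch, assuming stream: true behaves like the other chat engines.
  const streamed = await engine.chat({
    message: [{ type: "text", text: "hi" }],
    stream: true,
  });
  for await (const chunk of streamed) {
    process.stdout.write(chunk.response);
  }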


changeset-bot bot commented Jun 5, 2024

🦋 Changeset detected

Latest commit: 996b181

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 10 packages
Name Type
llamaindex Patch
docs Patch
@llamaindex/community Patch
@llamaindex/experimental Patch
@llamaindex/cloudflare-worker-agent-test Patch
@llamaindex/next-agent-test Patch
@llamaindex/nextjs-edge-runtime-test Patch
@llamaindex/waku-query-engine-test Patch
@llamaindex/autotool-01-node-example Patch
@llamaindex/autotool-02-next-example Patch



vercel bot commented Jun 5, 2024

The latest updates on your projects.

Name                 Status    Updated (UTC)
llama-index-ts-docs  ✅ Ready  Jun 5, 2024 5:26pm

@marcusschiesser
Collaborator

As AgentRunner already implements ChatEngine, every agent already is a chat engine. Why do we need a new chat engine then? If we need the chat history, how about adding it to the agent implementation?
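
(For illustration, a minimal sketch of that point: the agent can be called directly through the same chat interface. Constructor parameters are taken from the example above; the model name is a placeholder, and the exact return shape is what the rest of this thread debates.)

import { OpenAI, OpenAIAgent } from "llamaindex";

// Sketch only: the agent is used directly, with no extra engine wrapper.
const agent = new OpenAIAgent({
  llm: new OpenAI({ model: "gpt-4-turbo" }),
  tools: [],
  systemPrompt: "...",
});

// Returns an AgentChatResponse rather than the Response the other chat
// engines produce, which is the mismatch discussed below.
const result = await agent.chat({ message: "hi" });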

@parhammmm
Contributor Author

parhammmm commented Jun 10, 2024

As AgentRunner already implements ChatEngine, every agent already is a chat engine. Why do we need a new chat engine then? If we need the chat history, how about adding it to the agent implementation?

@marcusschiesser it felt safer. One reason was that the output responses/streams of ChatEngines and Agents aren't compatible, so this engine is a way to standardise them to match the other engines.

I also couldn't trace exactly what happens, but the Agents seem to keep a history for internal usage; it would help to clarify what the history on the Agent does.

Are you thinking that the sources part of the Agent response is lost?

@marcusschiesser
Collaborator

@parhammmm @himself65 I guess we should then better unify the output responses/streams of ChatEngines and Agents?

@parhammmm
Contributor Author

parhammmm commented Jun 11, 2024

@marcusschiesser I'm not confident I can do it myself safely, but I can take a stab at it and sync with @himself65 for notes/review.

Would it be possible in the meantime to release AgentChatEngine in the community package? If it is, I can move it there and then take on the above.

@parhammmm
Contributor Author

@marcusschiesser what do you think?

@marcusschiesser
Collaborator

@himself65 parhammmm is proposing the new agent chat engine for two reasons (instead of just using the AgentRunner):

  1. To have an agent chat engine using ChatResponse like the other chat engines instead of AgentChatResponse
  2. To pass a chat history to the agents

About 1: I think it would be good to use ChatResponse in the AgentRunner too and get rid of AgentChatResponse. I'll start a PR to open the discussion for this breaking change. @parhammmm you can look at my workaround for dealing with this problem at https://github.com/run-llama/create-llama/blob/main/templates/types/streaming/nextjs/app/api/chat/llamaindex-stream.ts
About 2: Seems to be a bug; I'll have a look into this.
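
(For illustration, a minimal sketch of what point 2 is about, assuming the agent's chat call accepts a chatHistory parameter the same way the other chat engines do; whether that history is actually honoured is exactly the bug in question.)

import { OpenAIAgent, type ChatMessage } from "llamaindex";

// Illustrative only: prior turns the agent should take into account.
const chatHistory: ChatMessage[] = [
  { role: "user", content: "My favourite colour is green." },
  { role: "assistant", content: "Noted!" },
];

const agent = new OpenAIAgent({ tools: [] });

// Point 2: the history passed here should inform the agent's answer.
const response = await agent.chat({
  message: "What is my favourite colour?",
  chatHistory,
});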

@parhammmm
Contributor Author

parhammmm commented Jun 14, 2024

@marcusschiesser thanks for taking a look. Part of the fix for 2 is actually in this PR and can be used to match the other engines here: https://github.com/run-llama/LlamaIndexTS/pull/909/files#diff-e722a872695a5840308e8f2f844e6a3d6f25969880fe3e670ae5797ac38a9724R289

What do you think of doing the two patches for 1 & 2 in two different PRs? The history bug is the most important, and fixing it would help unlock building a GPT-style chat, which is what I'm actually using AgentChatEngine for.

Thanks again.

@marcusschiesser
Collaborator

@parhammmm great, yes 1 & 2 should be one PR each. If you're blocked by 2, would you like to send a PR for it by changing the AgentRunner directly?

@parhammmm
Contributor Author

@marcusschiesser sure thing, will get to it shortly.

@marcusschiesser
Collaborator

@himself65 and @parhammmm I started a draft PR on how to solve problem 1: #930. Please leave your comments.

@parhammmm
Contributor Author

parhammmm commented Jun 17, 2024

@marcusschiesser @himself65 I'm not deep enough into LITS to be able to give good feedback on #930, but I pushed the history fix as PR #933.

And as a workaround for the agent's responses I'm doing the following on our side, which translates to adding a utility that handles the merging of the two types (just in case it's helpful):

import { Response } from "llamaindex";
import { ReadableStream } from "node:stream/web";

// Streaming agent chunks carry their text on `response.delta`,
// while chat engine chunks carry it directly on `response`.
type LlamaindexAgentResponse = { response: { delta: string } };
type LlamaindexResponse = Response | LlamaindexAgentResponse;

...
// agent or chat engine
const response: AsyncIterable<Response> | ReadableStream<LlamaindexAgentResponse> =
  await agent.chat({ ... });

...
// `value` is one chunk read from the response stream
const result =
  typeof value.response === "string"
    ? value.response
    : value.response.delta;
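
A rough sketch of how that check could be wrapped into a reusable helper; the helper name and the assumption that both stream kinds are async-iterable in Node are illustrative, not part of the PR:

// Hypothetical helper: drain either stream shape into plain text.
async function collectText(
  stream: AsyncIterable<Response> | ReadableStream<LlamaindexAgentResponse>,
): Promise<string> {
  let text = "";
  // Node's ReadableStream is async-iterable, so one loop covers both cases.
  for await (const value of stream as AsyncIterable<LlamaindexResponse>) {
    text +=
      typeof value.response === "string" ? value.response : value.response.delta;
  }
  return text;
}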

@parhammmm
Contributor Author

Closing this in favour of #933 and #930.

@parhammmm parhammmm closed this Jun 17, 2024
@parhammmm parhammmm deleted the feature-agent-chat-engine branch June 17, 2024 17:30