
feat(tools): use agent's memory within LLM Tool #242

Merged (5 commits, Dec 11, 2024)
Conversation

@Tomas2D (Contributor) commented Dec 10, 2024

Features

  • Propagate the agent's (runner) memory to the Tool's run context.
  • Replace the old (unused) LLMTool with a new one that uses the provided LLM to accomplish the agent's intent (a custom system prompt plus the messages from the memory).
new LLMTool({
  llm: BAMChatLLM.fromPreset("meta-llama/llama-3-8b-instruct"),
  name: "LLM", // optional
  description: "Uses expert LLM to work with data in the existing conversation (classification, entity extraction, summarization, ...)", // optional
  template: LLMTool.template, // optional (system prompt)
})
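Conceptually, the new tool renders its system template and forwards the conversation already held in memory, plus the agent's task, to the provided LLM. A minimal self-contained sketch of that idea, using local mock types and hypothetical names (not the framework's actual implementation):

```typescript
// Mock of the idea behind LLMTool: combine a system prompt with the
// messages already in the agent's memory and send them to a chat LLM.
interface Message {
  role: "system" | "user" | "assistant";
  text: string;
}

interface MemoryLike {
  messages: Message[];
}

// Hypothetical stand-in for a chat LLM; the real tool receives a ChatLLM instance.
type ChatFn = (messages: Message[]) => Promise<string>;

async function runLLMTool(
  llm: ChatFn,
  memory: MemoryLike,
  task: string,
  systemTemplate = "You are an expert assistant. Use the conversation to fulfil the task.",
): Promise<string> {
  const input: Message[] = [
    { role: "system", text: systemTemplate }, // rendered system prompt
    ...memory.messages, // conversation propagated from the agent's memory
    { role: "user", text: task }, // the agent's intent, e.g. "summarize"
  ];
  return llm(input);
}
```

With a stub `ChatFn` in place of a real model, `runLLMTool` simply assembles the message list and delegates; the real tool's value comes from the LLM it wraps.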

Standalone Tool Example (uses Ollama)

yarn start examples/tools/llm.ts

Agent Example

import "dotenv/config";
import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { createConsoleReader } from "examples/helpers/io.js";
import { FrameworkError } from "bee-agent-framework/errors";
import { UnconstrainedMemory } from "bee-agent-framework/memory/unconstrainedMemory";
import { BAMChatLLM } from "bee-agent-framework/adapters/bam/chat";
import { WikipediaTool } from "bee-agent-framework/tools/search/wikipedia";
import { LLMTool } from "bee-agent-framework/tools/llm";

const agent = new BeeAgent({
  llm: BAMChatLLM.fromPreset("meta-llama/llama-3-1-70b-instruct"),
  memory: new UnconstrainedMemory(),
  tools: [
    new LLMTool({
      llm: BAMChatLLM.fromPreset("meta-llama/llama-3-8b-instruct"),
    }),
    new WikipediaTool(),
  ],
});

const reader = createConsoleReader();

try {
  for await (const { prompt } of reader) {
    const response = await agent
      .run({
        prompt,
      })
      .observe((emitter) => {
        emitter.on("retry", () => {
          reader.write(`Agent 🤖 : `, "retrying the action...");
        });
        emitter.on("update", async ({ update }) => {
          reader.write(`Agent (${update.key}) 🤖 : `, `${update.value}`);
        });
      });

    reader.write(`Agent 🤖 : `, response.result.text);
  }
} catch (error) {
  reader.write("ERROR", FrameworkError.ensure(error).dump());
}
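The first feature bullet means any tool invoked by the agent can now see the runner's conversation in its run context. A hedged sketch of what a custom tool could do with that, again with local mock types (the framework's real RunContext shape differs):

```typescript
// Sketch of "propagate the agent's memory to the Tool's run context".
// All type names here are illustrative mocks, not the framework's API.
interface Message {
  role: string;
  text: string;
}

interface MemoryLike {
  messages: Message[];
}

interface ToolRunContext {
  memory?: MemoryLike; // hypothetical: memory handed down by the runner
}

// A custom tool can inspect the conversation it is being called within,
// e.g. to recover the most recent user utterance.
function lastUserUtterance(ctx: ToolRunContext): string | undefined {
  return ctx.memory?.messages.filter((m) => m.role === "user").at(-1)?.text;
}
```

Returning `undefined` when no memory (or no user message) is present keeps the helper safe for tools invoked outside an agent run.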

TODOs:

  • evaluate

cc @aleskalfas, @michael-desmond

Ref: #184
Signed-off-by: Tomas Dvorak <toomas2d@gmail.com>
@michael-desmond (Contributor) left a comment:

Looks good. Minor typos in prompts.

src/tools/llm.ts: two review threads (outdated, resolved)
@aleskalfas (Contributor) commented:
Great 👍

@Tomas2D Tomas2D requested a review from aleskalfas December 11, 2024 11:08
@Tomas2D Tomas2D enabled auto-merge (squash) December 11, 2024 11:16
@Tomas2D Tomas2D disabled auto-merge December 11, 2024 11:18
@Tomas2D Tomas2D enabled auto-merge (squash) December 11, 2024 11:19
@Tomas2D Tomas2D disabled auto-merge December 11, 2024 11:19
@Tomas2D Tomas2D merged commit 0407c66 into main Dec 11, 2024
4 checks passed
@Tomas2D Tomas2D deleted the feat/184-llm-tool branch December 11, 2024 11:20
3 participants