feat: Integrate with agent/tool agent from go-llm #896

Open
nkwangleiGIT opened this issue Mar 19, 2024 · 3 comments

@nkwangleiGIT
Contributor

Check if any assets we can reuse from https://github.com/natexcvi/go-llm.

@Abirdcfly
Collaborator

It's a pretty creative project, but it only supports OpenAI's GPT chat completion API and relies on model-native function calls; the doc is https://platform.openai.com/docs/api-reference/chat/create#chat-create-function_call
An example request is:

curl https://api.openai.com/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $OPENAI_API_KEY" \
-d '{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "What is the weather like in Boston?"
    }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": {
              "type": "string",
              "description": "The city and state, e.g. San Francisco, CA"
            },
            "unit": {
              "type": "string",
              "enum": ["celsius", "fahrenheit"]
            }
          },
          "required": ["location"]
        }
      }
    }
  ],
  "tool_choice": "auto"
}'

The output then contains no normal assistant text; instead the assistant message carries a tool_calls array with the function name and its JSON-encoded arguments.
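
A representative response body looks roughly like this (a sketch of the shape described in the OpenAI doc linked above; the id, created, and usage values are placeholders):

{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1710000000,
  "model": "gpt-3.5-turbo",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_abc123",
            "type": "function",
            "function": {
              "name": "get_current_weather",
              "arguments": "{\"location\": \"Boston, MA\", \"unit\": \"celsius\"}"
            }
          }
        ]
      },
      "finish_reason": "tool_calls"
    }
  ],
  "usage": {
    "prompt_tokens": 82,
    "completion_tokens": 18,
    "total_tokens": 100
  }
}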


According to lm-sys/FastChat #2214, FastChat does not support this yet, while the ZhipuAI v4 API does support function_call.

@Abirdcfly
Collaborator

Currently only the ChatGLM3-6B model supports tool calling; the ChatGLM3-6B-Base and ChatGLM3-6B-32K models do not.
https://github.com/THUDM/ChatGLM3/blob/main/tools_using_demo/README.md

@Abirdcfly
Collaborator

Qwen-Chat's approach is the same as ours today: it uses the ReAct method and writes the tools into the prompt:
https://github.com/QwenLM/Qwen/blob/main/examples/react_prompt.md

From that doc: "Based on this principle, we provide Function Calling support in openai_api.py."

So if we also want FastChat to implement the OpenAI-standard function_call method, it seems we could do it the same way.
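
For reference, a ReAct-style prompt of the kind react_prompt.md describes looks roughly like the following (a paraphrased sketch using the weather tool from the example above, not Qwen's exact template):

Answer the following questions as best you can. You have access to the following tools:

get_current_weather: Get the current weather in a given location. Parameters: {"location": string, "unit": "celsius" | "fahrenheit"}

Use the following format:

Question: the input question you must answer
Thought: you should always think about what to do
Action: the action to take, should be one of [get_current_weather]
Action Input: the input to the action
Observation: the result of the action
... (this Thought/Action/Action Input/Observation can repeat)
Thought: I now know the final answer
Final Answer: the final answer to the original input question

Begin!

Question: What is the weather like in Boston?

The model emits the Thought/Action/Action Input text, the caller executes the tool and feeds the result back as the Observation, and an OpenAI-compatible server such as openai_api.py can translate this loop into the standard function_call/tool_calls fields.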

@bjwswang added this to the v0.4.0 milestone Apr 9, 2024