
[Bug]: function calling not working for ollama/gemma:7b #2209

Closed
code959437957 opened this issue Feb 27, 2024 · 12 comments · Fixed by #3469
Labels
bug Something isn't working

Comments

@code959437957

What happened?

A bug happened!

It returns this result:

{
    "id": "chatcmpl-10202250-6285-41fd-896e-ff11d7982d69",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": null,
                "role": "assistant",
                "tool_calls": [
                    {
                        "id": "call_a789c755-cbfc-4abf-b9e7-434c64608211",
                        "function": {
                            "arguments": "{\n  \"name\": \"get_current_weather\",\n  \"description\": \"Get the current weather in a given location\",\n  \"parameters\": {\n    \"type\": \"object\",\n    \"properties\": {\n      \"location\": {\n        \"type\": \"string\",\n        \"description\": \"The city and state, e.g. San Francisco, CA\"\n      },\n      \"unit\": {\n        \"type\": \"string\",\n        \"enum\": [\"celsius\", \"fahrenheit\"]\n      }\n    },\n    \"required\": [\"location\"]\n  }\n}",
                            "name": ""
                        },
                        "type": "function"
                    }
                ]
            }
        }
    ],
    "created": 1708994428,
    "model": "ollama/gemma:7b",
    "object": "chat.completion",
    "system_fingerprint": null,
    "usage": {
        "prompt_tokens": 119,
        "completion_tokens": 132,
        "total_tokens": 251
    }
}

Relevant log output

First we start the CLI:

$ litellm --model ollama/gemma:7b --api_base http://localhost:11434   

Then we call it via curl:

$ curl --location 'http://192.168.0.27:8000/v1/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
  "model": "ollama/gemma",
  "messages": [
    {"role": "user", "content": "What'\''s the weather like in San Francisco"}
  ],
  "functions": [
    {
      "name": "get_current_weather",
      "description": "Get the current weather in a given location",
      "parameters": {
        "type": "object",
        "properties": {
          "location": {
            "type": "string",
            "description": "The city and state, e.g. San Francisco, CA"
          },
          "unit": {
            "type": "string",
            "enum": ["celsius", "fahrenheit"]
          }
        },
        "required": ["location"]
      }
    }
  ]
}'

{"id":"chatcmpl-1ad50ff9-1817-4271-9bd0-43cbb7dc839f","choices":[{"finish_reason":"stop","index":0,"message":{"content":null,"role":"assistant","tool_calls":[{"id":"call_097a25c8-6c60-49ad-a5be-8549e6f00eb0","function":{"arguments":"{\n  \"name\": \"get_current_weather\",\n  \"description\": \"Get the current weather in a given location\",\n  \"parameters\": {\n    \"type\": \"object\",\n    \"properties\": {\n      \"location\": {\n        \"type\": \"string\",\n        \"description\": \"The city and state, e.g. San Francisco, CA\"\n      },\n      \"unit\": {\n        \"type\": \"string\",\n        \"enum\": [\"celsius\", \"fahrenheit\"]\n      }\n    },\n    \"required\": [\"location\"]\n  }\n}","name":""},"type":"function"}]}}],"created":1708995801,"model":"ollama/gemma:7b","object":"chat.completion","system_fingerprint":null,"usage":{"prompt_tokens":123,"completion_tokens":132,"total_tokens":255}}


@krrishdholakia
Contributor

krrishdholakia commented Feb 27, 2024

What's the error? @code959437957

this looks like it worked:

{"id":"chatcmpl-1ad50ff9-1817-4271-9bd0-43cbb7dc839f","choices":[{"finish_reason":"stop","index":0,"message":{"content":null,"role":"assistant","tool_calls":[{"id":"call_097a25c8-6c60-49ad-a5be-8549e6f00eb0","function":{"arguments":"{\n "name": "get_current_weather",\n "description": "Get the current weather in a given location",\n "parameters": {\n "type": "object",\n "properties": {\n "location": {\n "type": "string",\n "description": "The city and state, e.g. San Francisco, CA"\n },\n "unit": {\n "type": "string",\n "enum": ["celsius", "fahrenheit"]\n }\n },\n "required": ["location"]\n }\n}","name":""},"type":"function"}]}}],"created":1708995801,"model":"ollama/gemma:7b","object":"chat.completion","system_fingerprint":null,"usage":{"prompt_tokens":123,"completion_tokens":132,"total_tokens":255}}

@code959437957
Author

> What's the error? @code959437957
>
> this looks like it worked:

I don't know whether this is a problem with LiteLLM or with the Gemma model.

The response should look like:

{
  "id": "chatcmpl-123",
  "...": "...",
  "choices": [{
    "index": 0,
    "message": {
      "role": "assistant",
      "content": null,
      "function_call": {
        "name": "get_current_weather",
        "arguments": "{ \"location\": \"Boston, MA\"}"
      }
    },
    "finish_reason": "function_call"
  }]
}

Do you see it? ChatGPT responds with the correct function call, where arguments contains just the location, e.g. "Boston, MA".
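
To make the breakage concrete, here is a minimal client-side sketch (my own illustration, not from the thread) of how such a response is normally dispatched, using the openai Python SDK's tools API against the LiteLLM proxy started above. With the nested result shown earlier, it fails at the name lookup:

import json
from openai import OpenAI  # assumes the openai Python SDK v1+

def get_current_weather(location, unit="fahrenheit"):
    """Stub handler so the dispatch below has something to call."""
    return f"72 and sunny in {location} ({unit})"

# Points at the LiteLLM proxy started above; the key is not checked locally.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="sk-anything")

response = client.chat.completions.create(
    model="ollama/gemma:7b",
    messages=[{"role": "user", "content": "What's the weather like in San Francisco"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {"type": "string"},
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }],
)

call = response.choices[0].message.tool_calls[0].function
# Expected: call.name == "get_current_weather" and call.arguments
# holding just '{"location": "San Francisco, CA"}'.
# With this bug: call.name == "" and call.arguments holds the schema,
# so the dispatch below raises KeyError.
handler = {"get_current_weather": get_current_weather}[call.name]
print(handler(**json.loads(call.arguments)))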

@code959437957
Author

It may be a missing feature of the Google Gemma model; see the original thread:
https://huggingface.co/google/gemma-7b/discussions/38

@ChristianWeyer

ChristianWeyer commented Mar 19, 2024

Actually, I see this with any function-capable model with the latest LiteLLM version @krrishdholakia.
I tried several (though, admittedly, not Gemma 😉).

We always get the wrongly nested result described above.

Please also see ShishirPatil/gorilla#247 (comment)

@krrishdholakia
Contributor

krrishdholakia commented Mar 19, 2024

Looking at this on openai's site - https://platform.openai.com/docs/api-reference/chat/create

{
  "id": "chatcmpl-abc123",
  "object": "chat.completion",
  "created": 1699896916,
  "model": "gpt-3.5-turbo-0125",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "tool_calls": [
          {
            "id": "call_abc123",
            "type": "function",
            "function": {
              "name": "get_current_weather",
              "arguments": "{\n\"location\": \"Boston, MA\"\n}"
            }
          }
        ]
      },
      "logprobs": null,
      "finish_reason": "tool_calls"
    }
  ],
  "usage": {
    "prompt_tokens": 82,
    "completion_tokens": 17,
    "total_tokens": 99
  }
}

I believe this format is being followed, unless I'm missing something?

If I'm wrong, can someone share a formatted LiteLLM response vs. the expected one?

cc: @ChristianWeyer @code959437957

@ChristianWeyer

Not quite @krrishdholakia
This is what I see via LiteLLM:

"tool_calls" : [
   {
      "function" : {
         "arguments" : "{\n    \"name\": \"get_current_weather\", \n    \"arguments\": {\"location\": \"Boston, MA\"}\n}\n",
         "name" : ""
      },
      "id" : "call_7e88f79b-b4d7-4f42-8c0d-363414ff6e08",
      "type" : "function"
   }
]

name is empty.
And the actual response is nested inside arguments.

Subtle ☺️.
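
Until this is fixed, a client can un-nest the call on its side. A minimal workaround sketch (my own, not from the thread), assuming the model reliably wraps the real call in a JSON object with "name" and "arguments" keys, which is not guaranteed:

import json

def unnest_tool_call(function: dict) -> dict:
    """Recover the intended call from the wrongly nested tool_call,
    where "name" is empty and the whole model output sits in "arguments"."""
    payload = json.loads(function["arguments"])
    args = payload.get("arguments", payload.get("parameters", {}))
    return {
        "name": payload.get("name", ""),
        "arguments": args if isinstance(args, str) else json.dumps(args),
    }

broken = {
    "name": "",
    "arguments": '{"name": "get_current_weather", "arguments": {"location": "Boston, MA"}}',
}
print(unnest_tool_call(broken))
# {'name': 'get_current_weather', 'arguments': '{"location": "Boston, MA"}'}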

@ChristianWeyer

ChristianWeyer commented Mar 19, 2024

Maybe the issue is here @krrishdholakia:

"function": {"arguments": response_json["response"], "name": ""},

"function": {

@ChristianWeyer

BTW: I also think that it should be
"finish_reason" : "tool_calls"

With LiteLLM it is "stop".

@jackmpcollins
Contributor

I think this was resolved in v1.35.34+ by PR #1526 as discussed in related issue #3333 . Requires using the ollama_chat/ prefix in place of ollama/. Streaming responses remain broken.
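
For anyone landing here, the change is just the model prefix. A quick sketch with the litellm Python SDK (the CLI equivalent is passing ollama_chat/gemma:7b to litellm --model); the schema is trimmed from the original report:

import litellm

# Note ollama_chat/ instead of ollama/; assumes Ollama serves gemma:7b locally.
response = litellm.completion(
    model="ollama_chat/gemma:7b",
    api_base="http://localhost:11434",
    messages=[{"role": "user", "content": "What's the weather like in San Francisco"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }],
)
print(response.choices[0].message.tool_calls)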

@ChristianWeyer

> I think this was resolved in v1.35.34+ by PR #1526 as discussed in related issue #3333. Requires using the ollama_chat/ prefix in place of ollama/. Streaming responses remain broken.

Thanks for the heads-up!
That PR does not fix the wrong finish_reason issue, however.

... still wondering why my PR has not been accepted ... @krrishdholakia

@krrishdholakia
Contributor

Hey @ChristianWeyer Which PR are you referring to? I might've missed it.

We have finish reason mapping here -

def map_finish_reason(
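
For readers following along: such a mapping normalizes provider-specific finish reasons onto OpenAI's vocabulary. An illustrative sketch only, not LiteLLM's actual table:

def map_finish_reason(finish_reason: str) -> str:
    """Illustrative: map provider finish reasons to OpenAI's values
    ("stop", "length", "tool_calls", ...)."""
    mapping = {
        "stop_sequence": "stop",   # e.g. Anthropic
        "max_tokens": "length",
        "function_call": "tool_calls",
    }
    return mapping.get(finish_reason, finish_reason)

Note that a pure string mapping presumably cannot fix the case above, since Ollama itself reports "stop"; the handler would have to override the finish reason when it detects a tool call.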

@ChristianWeyer

> Hey @ChristianWeyer Which PR are you referring to? I might've missed it.
>
> We have finish reason mapping here -
>
> def map_finish_reason(

This:
#2597
