[Bug]: function calling not working for ollama/gemma:7b #2209
Comments
What's the error? @code959437957 this looks like it worked:
I don't know whether this is a problem with LiteLLM or with the Gemma model. The return should look like the correct format; do you see it? ChatGPT responds with the right function call, with arguments containing just the location.
It may be a missing feature of the Google Gemma model; see the original thread:
Actually, I see this with any function-capable model on the latest LiteLLM version, @krrishdholakia. We always get the wrong nested result described above. Please also see ShishirPatil/gorilla#247 (comment)
Looking at this on OpenAI's site: https://platform.openai.com/docs/api-reference/chat/create
I believe this format is being followed, unless I'm missing something? If I'm wrong, can someone share a formatted LiteLLM response vs. the expected one?
Not quite, @krrishdholakia:
```json
"tool_calls": [
  {
    "function": {
      "arguments": "{\n \"name\": \"get_current_weather\", \n \"arguments\": {\"location\": \"Boston, MA\"}\n}\n",
      "name": ""
    },
    "id": "call_7e88f79b-b4d7-4f42-8c0d-363414ff6e08",
    "type": "function"
  }
]
```
Subtle: the whole function-call JSON ends up nested inside "arguments", while "name" is left empty; compare the expected shape sketched below.
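For comparison, here is a minimal sketch of a correctly split tool-call entry in the OpenAI-style schema, reusing the id and values from the snippet above (illustrative only, not output from any model):

```python
# Expected OpenAI-style tool call entry (values reused from the snippet above):
expected_tool_call = {
    "id": "call_7e88f79b-b4d7-4f42-8c0d-363414ff6e08",
    "type": "function",
    "function": {
        "name": "get_current_weather",                  # the function name belongs here ...
        "arguments": "{\"location\": \"Boston, MA\"}",  # ... and only the JSON-encoded arguments here
    },
}

# What the snippet above shows instead: the whole {"name": ..., "arguments": ...}
# blob is serialized into "arguments" and "name" is left empty.
```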
Maybe the issue is here, @krrishdholakia:
litellm/litellm/llms/ollama.py, line 221 (in 4913ad4)
litellm/litellm/llms/ollama.py, line 318 (in 4913ad4)
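If the root cause is in how the raw Ollama completion text is turned into a tool call, the kind of fix being suggested might look like the sketch below. This is not LiteLLM's actual code; it assumes the model emits a single JSON blob of the form {"name": ..., "arguments": {...}} in its text output, which then needs to be split into the OpenAI-style fields rather than dumped wholesale into "arguments":

```python
import json
import uuid


def ollama_text_to_tool_call(response_text: str) -> dict:
    """Sketch: convert a Gemma/Ollama function-call blob into an OpenAI-style tool call.

    Assumes the model's text output is a JSON object like
    {"name": "get_current_weather", "arguments": {"location": "Boston, MA"}}.
    """
    parsed = json.loads(response_text)
    return {
        "id": f"call_{uuid.uuid4()}",
        "type": "function",
        "function": {
            # Split the blob: the name goes to "name", and only the inner
            # arguments object is re-serialized into "arguments".
            "name": parsed.get("name", ""),
            "arguments": json.dumps(parsed.get("arguments", {})),
        },
    }
```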
BTW: I also think that the finish_reason should be …, while with LiteLLM it is …
Thanks for the heads-up! ... Still wondering why my PR has not been accepted ... @krrishdholakia
Hey @ChristianWeyer, which PR are you referring to? I might've missed it. We have finish reason mapping here:
Line 188 in 918367c
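For reference, in the OpenAI chat-completions schema a choice that carries tool calls reports finish_reason as "tool_calls" rather than "stop". A hypothetical post-processing check on a response choice could look like this (the helper name and the dict layout are assumptions for illustration, not LiteLLM's mapping code):

```python
def normalize_finish_reason(choice: dict) -> str:
    """Hypothetical helper: report "tool_calls" whenever the choice actually
    contains tool calls, mirroring the OpenAI chat-completions schema."""
    if choice.get("message", {}).get("tool_calls"):
        return "tool_calls"
    return choice.get("finish_reason", "stop")


# Example: a choice whose message carries a tool call but whose
# finish_reason came back as "stop" gets normalized to "tool_calls".
choice = {
    "finish_reason": "stop",
    "message": {"tool_calls": [{"type": "function"}]},
}
assert normalize_finish_reason(choice) == "tool_calls"
```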
This:
What happened?
A bug happened!
It returns the result:
Relevant log output
Then we call it via curl:
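The original curl command and log output did not survive extraction. As a rough reproduction sketch (the tool definition and prompt are assumptions based on the weather example discussed above), an equivalent call through LiteLLM's Python API might look like:

```python
import litellm

# Hypothetical reproduction: ask ollama/gemma:7b for the weather via
# LiteLLM's OpenAI-compatible tools interface and inspect the tool call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City and state, e.g. Boston, MA",
                    },
                },
                "required": ["location"],
            },
        },
    }
]

response = litellm.completion(
    model="ollama/gemma:7b",
    messages=[{"role": "user", "content": "What's the weather like in Boston?"}],
    tools=tools,
)

# Per the report, function.name comes back empty and function.arguments
# contains the whole nested {"name": ..., "arguments": ...} blob.
print(response.choices[0].message.tool_calls)
```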