xu-chris/req_llm#2 (Closed)
Description
Bug Description
When using Mistral's API through the OpenAI-compatible format, tool calls are not parsed because `decode_openai_tool_call/1` requires a `"type" => "function"` field that Mistral does not include in its responses.
Environment
- ReqLLM version: 0.4.0
- Elixir: 1.19.1
- Provider: Mistral (using `use ReqLLM.Provider`)
Steps to Reproduce
- Configure a Mistral provider using the OpenAI-compatible format
- Make a request with tools defined
- LLM returns tool calls
- Tool calls are not parsed: `response.message.tool_calls` is `nil` despite `finish_reason: :tool_calls`
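The steps above can be sketched end to end. This is a hypothetical reproduction, not verified code: the `ReqLLM.tool/1` options and the `ReqLLM.generate_text/3` call follow ReqLLM's documented API, but the model string assumes a Mistral provider wired up as described under Environment.

```elixir
# Hypothetical repro sketch; adjust the model string to your Mistral provider
tool =
  ReqLLM.tool(
    name: "get_weather",
    description: "Get the current weather for a city",
    parameter_schema: [city: [type: :string, required: true]]
  )

{:ok, response} =
  ReqLLM.generate_text("mistral:mistral-small-latest", "What's the weather in Paris?",
    tools: [tool]
  )

response.finish_reason       # :tool_calls
response.message.tool_calls  # nil, but a get_weather call was expected
```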
Root Cause
In `lib/req_llm/provider/defaults.ex`, the `decode_openai_tool_call/1` function at ~line 877 requires `"type" => "function"`:
```elixir
defp decode_openai_tool_call(%{
       "id" => id,
       "type" => "function",  # <-- REQUIRED
       "function" => %{"name" => name, "arguments" => args_json}
     }) do
  case Jason.decode(args_json || "{}") do
    {:ok, args} -> ReqLLM.StreamChunk.tool_call(name, args, %{id: id})
    {:error, _} -> nil
  end
end

defp decode_openai_tool_call(_), do: nil  # Falls through here for Mistral
```

Mistral's API returns:
```elixir
%{
  "function" => %{"arguments" => "{\"city\": \"Paris\"}", "name" => "get_weather"},
  "id" => "lVauww8VE",
  "index" => 0
  # NO "type" field!
}
```

OpenAI's API returns:
```elixir
%{
  "function" => %{"arguments" => "{\"city\":\"Paris\"}", "name" => "get_weather"},
  "id" => "call_...",
  "type" => "function"  # HAS "type" field
}
```

Since Mistral omits the "type" field, the pattern match fails and falls through to the catch-all clause, which returns nil.
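The mismatch can be seen in isolation with `Kernel.match?/2`. This standalone snippet (no ReqLLM dependency; map shapes copied from the payloads above) shows the strict pattern accepting OpenAI's shape but rejecting Mistral's:

```elixir
# The pattern used by decode_openai_tool_call/1, checked against both payloads
strict_match? = fn map ->
  match?(
    %{"id" => _, "type" => "function", "function" => %{"name" => _, "arguments" => _}},
    map
  )
end

mistral = %{
  "function" => %{"arguments" => "{\"city\": \"Paris\"}", "name" => "get_weather"},
  "id" => "lVauww8VE",
  "index" => 0
}

openai = %{
  "function" => %{"arguments" => "{\"city\":\"Paris\"}", "name" => "get_weather"},
  "id" => "call_abc",
  "type" => "function"
}

IO.inspect(strict_match?.(openai))   # true
IO.inspect(strict_match?.(mistral))  # false -> falls through to the nil clause
```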
Proposed Fix
Add a fallback clause that doesn't require the "type" field. Because Elixir tries function clauses in order, this is backwards-compatible: the fallback only matches when the stricter pattern fails.
```elixir
# Existing pattern for OpenAI and others that include "type"
defp decode_openai_tool_call(%{
       "id" => id,
       "type" => "function",
       "function" => %{"name" => name, "arguments" => args_json}
     }) do
  case Jason.decode(args_json || "{}") do
    {:ok, args} -> ReqLLM.StreamChunk.tool_call(name, args, %{id: id})
    {:error, _} -> nil
  end
end

# Fallback for Mistral and others that omit the "type" field
defp decode_openai_tool_call(%{
       "id" => id,
       "function" => %{"name" => name, "arguments" => args_json}
     }) do
  case Jason.decode(args_json || "{}") do
    {:ok, args} -> ReqLLM.StreamChunk.tool_call(name, args, %{id: id})
    {:error, _} -> nil
  end
end

defp decode_openai_tool_call(_), do: nil
```

Why this belongs in ReqLLM defaults
- This is the generic OpenAI-compatible decoder used by multiple providers
- Other providers claiming OpenAI compatibility might have the same issue
- The fix is backwards-compatible - existing OpenAI responses still work
- Putting this in individual provider overrides would require duplicating the entire decode pipeline
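As a quick sanity check, the clause ordering can be exercised outside ReqLLM. This sketch substitutes a plain map for `ReqLLM.StreamChunk.tool_call/3` and Elixir's built-in `JSON` module (available since 1.18, so covered by the 1.19.1 environment above) for Jason, so it runs with no dependencies:

```elixir
defmodule ToolCallDecodeCheck do
  # Minimal stand-in for the proposed clause ordering in defaults.ex.
  # Returns a plain map instead of a StreamChunk, and uses the built-in
  # JSON module instead of Jason, purely to keep the sketch standalone.
  def decode(%{
        "id" => id,
        "type" => "function",
        "function" => %{"name" => name, "arguments" => args_json}
      }),
      do: do_decode(id, name, args_json)

  # Fallback: same shape, minus the "type" field (Mistral)
  def decode(%{"id" => id, "function" => %{"name" => name, "arguments" => args_json}}),
    do: do_decode(id, name, args_json)

  def decode(_), do: nil

  defp do_decode(id, name, args_json) do
    case JSON.decode(args_json || "{}") do
      {:ok, args} -> %{id: id, name: name, args: args}
      {:error, _} -> nil
    end
  end
end

# A Mistral-shaped payload now decodes instead of falling through to nil
ToolCallDecodeCheck.decode(%{
  "id" => "lVauww8VE",
  "index" => 0,
  "function" => %{"name" => "get_weather", "arguments" => ~s({"city": "Paris"})}
})
# => %{id: "lVauww8VE", name: "get_weather", args: %{"city" => "Paris"}}
```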
I'm happy to submit a PR with this fix if that would be helpful.