Mistral tool_calls not parsed - missing 'type' field in API response #372

@xu-chris

Description


Bug Description

When using Mistral's API through the OpenAI-compatible format, tool calls are never parsed: decode_openai_tool_call/1 pattern-matches on a "type" => "function" field that Mistral does not include in its responses, so every Mistral tool call falls through to the catch-all clause.

Environment

  • ReqLLM version: 0.4.0
  • Elixir: 1.19.1
  • Provider: Mistral (using use ReqLLM.Provider)

Steps to Reproduce

  1. Configure a Mistral provider using the OpenAI-compatible format
  2. Make a request with tools defined
  3. LLM returns tool calls
  4. Tool calls are not parsed - response.message.tool_calls is nil despite finish_reason: :tool_calls

Root Cause

In lib/req_llm/provider/defaults.ex, the decode_openai_tool_call/1 function at ~line 877 requires "type" => "function":

defp decode_openai_tool_call(%{
       "id" => id,
       "type" => "function",  # <-- REQUIRED
       "function" => %{"name" => name, "arguments" => args_json}
     }) do
  case Jason.decode(args_json || "{}") do
    {:ok, args} -> ReqLLM.StreamChunk.tool_call(name, args, %{id: id})
    {:error, _} -> nil
  end
end

defp decode_openai_tool_call(_), do: nil  # Falls through here for Mistral

Mistral API returns:

%{
  "function" => %{"arguments" => "{\"city\": \"Paris\"}", "name" => "get_weather"},
  "id" => "lVauww8VE",
  "index" => 0
  # NO "type" field!
}

OpenAI API returns:

%{
  "function" => %{"arguments" => "{\"city\":\"Paris\"}", "name" => "get_weather"},
  "id" => "call_...",
  "type" => "function"  # HAS "type" field
}

Since Mistral omits the "type" field, the pattern match fails and falls through to the catch-all clause returning nil.
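The fallthrough can be reproduced in isolation. Below is a minimal standalone sketch: DecodeSketch and its plain-map return value are illustrative stand-ins (not ReqLLM's actual module or ReqLLM.StreamChunk.tool_call/3), and the Jason.decode step is elided so only the pattern match is exercised:

```elixir
defmodule DecodeSketch do
  # Mirrors the shape of decode_openai_tool_call/1: the clause requires
  # "type" => "function" to be present in the tool-call map.
  def decode(%{
        "id" => id,
        "type" => "function",
        "function" => %{"name" => name, "arguments" => args}
      }),
      do: %{id: id, name: name, arguments: args}

  # Catch-all clause, as in defaults.ex.
  def decode(_), do: nil
end

# OpenAI shape ("type" present) matches the first clause:
DecodeSketch.decode(%{
  "id" => "call_1",
  "type" => "function",
  "function" => %{"name" => "get_weather", "arguments" => ~s({"city":"Paris"})}
})
# -> decodes to a tool-call stub

# Mistral shape (no "type") fails the first clause and hits the catch-all:
DecodeSketch.decode(%{
  "id" => "lVauww8VE",
  "index" => 0,
  "function" => %{"name" => "get_weather", "arguments" => ~s({"city": "Paris"})}
})
# -> nil
```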

Proposed Fix

Add a fallback clause that does not require the "type" field. Because Elixir tries function clauses in order, the fallback only matches when the stricter clause fails, so the change is backwards-compatible: responses that include "type" => "function" still hit the first clause.

# Existing pattern for OpenAI and others that include "type"
defp decode_openai_tool_call(%{
       "id" => id,
       "type" => "function",
       "function" => %{"name" => name, "arguments" => args_json}
     }) do
  case Jason.decode(args_json || "{}") do
    {:ok, args} -> ReqLLM.StreamChunk.tool_call(name, args, %{id: id})
    {:error, _} -> nil
  end
end

# Fallback for Mistral and others that omit "type" field
defp decode_openai_tool_call(%{
       "id" => id,
       "function" => %{"name" => name, "arguments" => args_json}
     }) do
  case Jason.decode(args_json || "{}") do
    {:ok, args} -> ReqLLM.StreamChunk.tool_call(name, args, %{id: id})
    {:error, _} -> nil
  end
end

defp decode_openai_tool_call(_), do: nil
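A standalone sketch of the fixed clause set (again using an illustrative module name, stubbing the StreamChunk constructor with a plain map, and eliding JSON decoding) shows that both payload shapes now decode while clause order keeps OpenAI responses on the strict path:

```elixir
defmodule FallbackSketch do
  # Strict clause: OpenAI-style payloads that include "type" => "function".
  def decode(%{
        "id" => id,
        "type" => "function",
        "function" => %{"name" => name, "arguments" => args}
      }),
      do: %{id: id, name: name, arguments: args}

  # Fallback clause: Mistral-style payloads that omit "type". Elixir tries
  # clauses top to bottom, so this only runs when the stricter clause
  # above did not match.
  def decode(%{
        "id" => id,
        "function" => %{"name" => name, "arguments" => args}
      }),
      do: %{id: id, name: name, arguments: args}

  def decode(_), do: nil
end

# The Mistral payload now decodes instead of falling through to nil:
FallbackSketch.decode(%{
  "id" => "lVauww8VE",
  "index" => 0,
  "function" => %{"name" => "get_weather", "arguments" => ~s({"city": "Paris"})}
})
```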

Why this belongs in ReqLLM defaults

  1. This is the generic OpenAI-compatible decoder used by multiple providers
  2. Other providers claiming OpenAI compatibility might have the same issue
  3. The fix is backwards-compatible - existing OpenAI responses still work
  4. Putting this in individual provider overrides would require duplicating the entire decode pipeline

I'm happy to submit a PR with this fix if that would be helpful.
