Local Ollama tool calling either not calling or failing outright #1034

@davidvfx07

Description

Hi, I'm trying a local Ollama model that supports tool calls (qwen3:32b). When I ask, for example, "What networks am I connected to?", the model just thinks about which tool to use to fulfill the request but never does anything. Occasionally it will act, as I've seen with prompts like "What's in my .config files?", and it seems to generate the tool-call JSON correctly, something in the ballpark of {"Bash": "LS", ...}, but the command never actually executes; it's just printed to the chat. I'd assume something isn't configured right, or maybe the tool-call JSON isn't formatted correctly after all. I've confirmed tool calling works with GitHub Copilot GPT-4o, and all tools were enabled in my settings for both Copilot and Ollama. What is the expected workflow to get opencode to actually run the command after the model suggests it?

(I simplified my config here for space, keeping only what's relevant; nothing else differs except some other Ollama models I added.)

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://127.0.0.1:11434/v1"
      },
      "qwen3:32b": {
        "name": "Qwen 3 32B (local)"
      }
    }
  }
}
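One way to narrow down where this breaks is to inspect the reply from the OpenAI-compatible endpoint directly: a working tool call arrives as a structured `tool_calls` array on the assistant message, whereas the behavior described above suggests the call is being serialized into the plain-text `content` channel, which a client can only print, not execute. A minimal sketch of that distinction (the helper and both sample payloads are hypothetical, illustrating the two shapes; they are not opencode internals):

```python
import json

def classify_tool_response(message: dict) -> str:
    """Classify an OpenAI-compatible assistant `message` dict.

    Returns "structured" when the model emitted a proper `tool_calls`
    array (what a client needs in order to execute the tool), "leaked"
    when tool-call-looking JSON appears as plain text content (the
    symptom in this issue: the call is printed to chat but never run),
    and "text" otherwise.
    """
    if message.get("tool_calls"):
        return "structured"
    content = (message.get("content") or "").strip()
    # Heuristic: content that parses as a non-empty JSON object is
    # probably a leaked tool call rather than a real answer.
    try:
        parsed = json.loads(content)
        if isinstance(parsed, dict) and parsed:
            return "leaked"
    except json.JSONDecodeError:
        pass
    return "text"

# Hypothetical sample of a working response shape:
structured = {
    "role": "assistant",
    "content": None,
    "tool_calls": [{
        "id": "call_0",
        "type": "function",
        "function": {"name": "bash", "arguments": "{\"command\": \"ls\"}"},
    }],
}

# Hypothetical sample of the failure mode described above:
leaked = {"role": "assistant", "content": "{\"Bash\": \"ls\"}"}

print(classify_tool_response(structured))  # structured
print(classify_tool_response(leaked))      # leaked
```

If responses from qwen3:32b consistently land in the "leaked" bucket, the model (or the provider layer) is emitting tool calls as text rather than as structured `tool_calls`, which would explain why nothing runs.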
