
[Bug]: Ollama integration fails - streaming NDJSON response not parsed correctly #1124

@rooftop-Owl

Description


Prerequisites

  • I have searched existing issues to avoid duplicates
  • I am using the latest version of oh-my-opencode
  • I have read the documentation

Bug Description

Ollama-based agents fail with "JSON Parse error: Unexpected EOF" when attempting tool calls. The root cause: oh-my-opencode calls the Ollama API with stream: true, but in streaming mode Ollama returns NDJSON (newline-delimited JSON), which cannot be parsed as a single JSON object.

Steps to Reproduce

  1. Configure oh-my-opencode to use Ollama provider with a local model (e.g., qwen3-coder, ministral-3:14b)
  2. Invoke any agent that uses tools (e.g., explore agent)
  3. Agent attempts a tool call
  4. Error occurs during response parsing

Direct reproduction with curl:

# This works (stream: false)
curl -s http://localhost:11434/api/chat -d '{"model": "qwen3-coder", "messages": [{"role": "user", "content": "Read file README.md"}], "stream": false, "tools": [{"type": "function", "function": {"name": "read", "description": "Read a file", "parameters": {"type": "object", "properties": {"filePath": {"type": "string"}}, "required": ["filePath"]}}}]}'

# This fails when parsed as single JSON (stream: true returns NDJSON)
curl -s http://localhost:11434/api/chat -d '{"model": "qwen3-coder", "messages": [{"role": "user", "content": "Read file README.md"}], "stream": true, "tools": [...]}'

Expected Behavior

Ollama-backed agents should behave the same as cloud-provider agents: tool calls should be parsed correctly regardless of streaming mode.

Actual Behavior

Agent invocation fails with:

JSON Parse error: Unexpected EOF

The streaming response consists of multiple NDJSON lines:

{"message":{"tool_calls":[{"function":{"name":"read","arguments":{"filePath":"README.md"}}}]}, "done":false}
{"message":{"content":""}, "done":true}

This cannot be parsed as a single JSON object.
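A minimal sketch showing the failure mode: feeding the whole NDJSON body to a single JSON.parse throws (the "Unexpected EOF"-style error above), while parsing line by line succeeds. The two-line body below is a simplified stand-in for the actual Ollama response.

```typescript
// Simplified NDJSON body: two JSON objects separated by newlines,
// mimicking Ollama's stream: true output shape.
const ndjson = '{"done":false}\n{"done":true}\n';

// Parsing the whole body as one JSON document fails: after the first
// object the parser hits trailing content it cannot accept.
let wholeBodyFailed = false;
try {
  JSON.parse(ndjson);
} catch {
  wholeBodyFailed = true;
}
console.log(wholeBodyFailed); // true

// Parsing each non-empty line individually works fine.
const chunks = ndjson
  .split("\n")
  .filter((line) => line.trim().length > 0)
  .map((line) => JSON.parse(line));
console.log(chunks.length); // 2
```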

Doctor Output

# Unable to run doctor - this is a development/testing environment
# The issue is reproducible with direct Ollama API calls

Error Logs

JSON Parse error: Unexpected EOF

Configuration

{
  "provider": "ollama",
  "model": "qwen3-coder"
}

Additional Context

Suggested fixes:

  1. Simple fix: Use stream: false for Ollama provider
  2. Better fix: Parse NDJSON properly by splitting on newlines and merging tool_calls
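Fix 2 could look roughly like the sketch below, assuming the raw response body is available as a string. mergeNdjsonChunks and the OllamaChunk shape are hypothetical names for illustration, not part of the oh-my-opencode or Ollama APIs; a real implementation would also handle partial lines arriving across stream reads.

```typescript
// Hypothetical shape of one NDJSON chunk from Ollama's /api/chat stream.
interface OllamaChunk {
  message?: { content?: string; tool_calls?: unknown[] };
  done?: boolean;
}

// Split the body on newlines, parse each line independently, and merge
// content fragments and tool_calls across chunks.
function mergeNdjsonChunks(body: string): { content: string; tool_calls: unknown[] } {
  const contentParts: string[] = [];
  const tool_calls: unknown[] = [];
  for (const line of body.split("\n")) {
    if (!line.trim()) continue; // skip blank/trailing lines
    const chunk = JSON.parse(line) as OllamaChunk;
    if (chunk.message?.content) contentParts.push(chunk.message.content);
    if (chunk.message?.tool_calls) tool_calls.push(...chunk.message.tool_calls);
  }
  return { content: contentParts.join(""), tool_calls };
}

// Example using the two NDJSON lines from this report:
const body =
  '{"message":{"tool_calls":[{"function":{"name":"read","arguments":{"filePath":"README.md"}}}]},"done":false}\n' +
  '{"message":{"content":""},"done":true}\n';
const merged = mergeNdjsonChunks(body);
console.log(merged.tool_calls.length); // 1
```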

Operating System

Linux (Ubuntu 22.04)

OpenCode Version

Latest (tested with oh-my-opencode plugin)
