
docs: add Ollama streaming NDJSON issue guide and workaround#1197

Merged
code-yeongyu merged 3 commits into code-yeongyu:dev from rooftop-Owl:fix/ollama-streaming-ndjson-parsing
Jan 28, 2026

Conversation

@rooftop-Owl (Contributor) commented Jan 28, 2026

Summary

Resolves #1124 - Ollama integration fails with streaming NDJSON responses

This PR provides documentation and utility code for the Ollama streaming issue.

Problem

When using Ollama with oh-my-opencode agents that use tools, the following error occurs:

JSON Parse error: Unexpected EOF

Root cause: Ollama returns NDJSON (newline-delimited JSON) when stream: true, but the SDK expects a single JSON object.
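
For illustration, a minimal TypeScript sketch of the mismatch. The sample lines approximate Ollama's /api/chat streaming format and are not copied from a real response:

```typescript
// Two representative lines of an Ollama /api/chat streaming response
// (stream: true). Each line is a complete JSON object, but the body as a
// whole is not valid JSON.
const ndjsonBody = [
  '{"model":"qwen3-coder","message":{"role":"assistant","content":"Hel"},"done":false}',
  '{"model":"qwen3-coder","message":{"role":"assistant","content":"lo"},"done":true}',
].join("\n");

// Parsing the whole body as one document throws (the exact message depends
// on the runtime), which is the failure reported in #1124.
try {
  JSON.parse(ndjsonBody);
} catch (err) {
  console.error("single-object parse failed:", err);
}

// Parsing line by line works, because each line is self-contained JSON.
const chunks = ndjsonBody
  .split("\n")
  .filter((line) => line.trim().length > 0)
  .map((line) => JSON.parse(line) as { message: { content: string } });

console.log(chunks.map((chunk) => chunk.message.content).join("")); // "Hello"
```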

Solution

  1. Troubleshooting Guide - Explains the issue and provides an immediate workaround
  2. Configuration Documentation - Documents the stream: false setting
  3. NDJSON Parser Utility - For future SDK integration

Verification

  • ✅ Tested with real Ollama API (ministral-3:14b, qwen3-coder)
  • ✅ NDJSON parser tested with real streaming data (9 lines)
  • ✅ Tool calls verified working with stream=false
  • ✅ No JSON parse errors
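
For reference, a minimal reproduction sketch of the stream: false behaviour (assumes a local Ollama instance on the default port 11434; qwen3-coder stands in for any locally pulled model):

```typescript
// Minimal reproduction sketch: with stream: false, Ollama's /api/chat
// returns a single JSON object, so a plain response.json() works.
async function chatOnce(): Promise<void> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "qwen3-coder", // example model; any locally pulled model works
      messages: [{ role: "user", content: "Say hello in one word." }],
      stream: false, // the workaround documented in this PR
    }),
  });

  const data = await response.json(); // one JSON object, no NDJSON involved
  console.log(data.message?.content);
}

chatOnce().catch(console.error);
```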

Files Changed

  • docs/troubleshooting/ollama-streaming-issue.md (NEW - 3.4 KB)
  • docs/configurations.md (MODIFIED - +60 lines)
  • src/shared/ollama-ndjson-parser.ts (NEW - 4.9 KB)

Summary by cubic

Documents and works around Ollama’s streaming NDJSON responses that caused “JSON Parse error: Unexpected EOF” during tool calls. Adds an NDJSON parser utility and documents setting stream: false to restore stability (fixes #1124).

  • New Features

    • NDJSON parser utility for Ollama streaming (parseOllamaStreamResponse, isNDJSONResponse).
    • Merges tool_calls and content, skips malformed lines, and collects basic stats.
  • Migration

    • For Ollama agents, set "stream": false in config.

Written for commit c177f56. Summary will update on new commits.

- Document problem: JSON Parse error when using Ollama with stream: true
- Explain root cause: NDJSON vs single JSON object mismatch
- Provide 3 solutions: disable streaming, avoid tool agents, wait for SDK fix
- Include NDJSON parsing code example for SDK maintainers
- Add curl testing command for verification
- Link to issue code-yeongyu#1124 and Ollama API docs

Fixes code-yeongyu#1124

- Add Ollama Provider section to configurations.md
- Document stream: false requirement for Ollama
- Explain NDJSON vs single JSON mismatch
- Provide supported models table (qwen3-coder, ministral-3, lfm2.5-thinking)
- Add troubleshooting steps and curl test command
- Link to troubleshooting guide

feat: add NDJSON parser utility for Ollama streaming responses

- Create src/shared/ollama-ndjson-parser.ts
- Implement parseOllamaStreamResponse() for merging NDJSON lines
- Implement isNDJSONResponse() for format detection
- Add TypeScript interfaces for Ollama message structures
- Include JSDoc with usage examples
- Handle edge cases: malformed lines, stats aggregation

This utility can be contributed to Claude Code SDK for proper NDJSON support.

Related to code-yeongyu#1124

- Replace console.warn with log() from shared/logger
- Remove trailing whitespace from troubleshooting guide
- Ensure TypeScript compatibility
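
For context, a rough sketch of the parsing approach these commits describe. The exported names parseOllamaStreamResponse and isNDJSONResponse come from the PR; the types and merging details below are approximations rather than the merged code in src/shared/ollama-ndjson-parser.ts:

```typescript
interface OllamaToolCall {
  function: { name: string; arguments: unknown };
}

interface OllamaStreamChunk {
  message?: { role?: string; content?: string; tool_calls?: OllamaToolCall[] };
  done?: boolean;
  // stats fields (e.g. eval_count) appear on the final ("done": true) line
  [key: string]: unknown;
}

interface MergedOllamaResponse {
  content: string;
  toolCalls: OllamaToolCall[];
  stats: Record<string, unknown>;
  skippedLines: number;
}

/** Heuristic: treat a body with multiple JSON lines as NDJSON. */
export function isNDJSONResponse(body: string): boolean {
  const lines = body.split("\n").filter((line) => line.trim().length > 0);
  return lines.length > 1 && lines.every((line) => line.trimStart().startsWith("{"));
}

/** Merge an NDJSON stream into a single response, skipping malformed lines. */
export function parseOllamaStreamResponse(body: string): MergedOllamaResponse {
  const result: MergedOllamaResponse = { content: "", toolCalls: [], stats: {}, skippedLines: 0 };

  for (const line of body.split("\n")) {
    if (!line.trim()) continue;
    let chunk: OllamaStreamChunk;
    try {
      chunk = JSON.parse(line);
    } catch {
      result.skippedLines += 1; // tolerate malformed lines instead of failing the whole response
      continue;
    }
    result.content += chunk.message?.content ?? "";
    const calls = chunk.message?.tool_calls;
    if (calls) result.toolCalls.push(...calls);
    if (chunk.done) {
      const { message, done, ...stats } = chunk;
      result.stats = stats; // basic stats from the final line
    }
  }

  return result;
}
```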
@github-actions bot commented Jan 28, 2026

All contributors have signed the CLA. Thank you! ✅
Posted by the CLA Assistant Lite bot.

@cubic-dev-ai bot left a comment

No issues found across 3 files

Confidence score: 5/5

  • Automated review surfaced no issues in the provided summaries.
  • No files require special attention.

@rooftop-Owl (Contributor, Author) commented

I have read the CLA Document and I hereby sign the CLA

@code-yeongyu (Owner) left a comment

LGTM! This is a well-documented fix for the Ollama streaming NDJSON issue.

Review summary:

  • Troubleshooting guide is clear and actionable
  • NDJSON parser utility is well-tested with real Ollama data
  • Configuration documentation properly explains the stream: false workaround
  • cubic review passed with no issues

The immediate workaround (stream: false) helps users now, and the parser utility sets up future SDK-level integration.

Approving for merge.

@code-yeongyu merged commit 895f366 into code-yeongyu:dev Jan 28, 2026
3 of 4 checks passed
@rooftop-Owl (Contributor, Author) commented

Thank you so much for the quick review and merge! 🎉

I'm incredibly grateful for your work on oh-my-opencode. As a graduate student in Korea, this project has been transformative in helping me orchestrate AI models while working within limited subscription budgets.

The Ollama streaming issue (#1124) was blocking my hybrid local/cloud workflow, where I route high-volume tasks to local GPU (Ollama) to preserve cloud tokens for quality-critical work. This fix enables 60-80% cost savings for budget-conscious users like myself.

I hope this contribution helps other researchers and students who are exploring cost-effective AI orchestration with local models.

Thank you for building and maintaining such an incredible tool that democratizes access to advanced AI orchestration! 🙏

P.S. I owe you more than a cup of coffee if we happen to meet in the real world (I also live in Korea 🇰🇷)

@rooftop-Owl (Contributor, Author) commented

FYI: Filed a follow-up issue #1508 regarding category delegation model resolution for Ollama.

Your PR #1197 successfully fixed the NDJSON parsing issue - direct agent routing now works perfectly. However, category-based delegation (delegate_task(category='quick')) still fails due to a separate model resolution issue in executor.ts.

Issue: #1508



Development

Successfully merging this pull request may close these issues.

[Bug]: Ollama integration fails - streaming NDJSON response not parsed correctly
