docs: add Ollama streaming NDJSON issue guide and workaround #1197
Conversation
- Document the problem: "JSON Parse error" when using Ollama with `stream: true`
- Explain the root cause: NDJSON vs. single-JSON-object mismatch
- Provide 3 solutions: disable streaming, avoid tool agents, wait for an SDK fix
- Include an NDJSON parsing code example for SDK maintainers
- Add a curl testing command for verification
- Link to issue code-yeongyu#1124 and the Ollama API docs

Fixes code-yeongyu#1124
- Add an Ollama Provider section to configurations.md
- Document the `stream: false` requirement for Ollama
- Explain the NDJSON vs. single-JSON mismatch
- Provide a supported-models table (qwen3-coder, ministral-3, lfm2.5-thinking)
- Add troubleshooting steps and a curl test command
- Link to the troubleshooting guide

feat: add NDJSON parser utility for Ollama streaming responses

- Create src/shared/ollama-ndjson-parser.ts
- Implement parseOllamaStreamResponse() for merging NDJSON lines
- Implement isNDJSONResponse() for format detection
- Add TypeScript interfaces for Ollama message structures
- Include JSDoc with usage examples
- Handle edge cases: malformed lines, stats aggregation

This utility can be contributed to the Claude Code SDK for proper NDJSON support. Related to code-yeongyu#1124
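The commit message above names the utility's two entry points. As a rough illustration of that shape, here is a minimal sketch of an NDJSON detector and merger for Ollama chat responses — the actual implementation in `src/shared/ollama-ndjson-parser.ts` may differ in interface and edge-case handling (e.g. the stats aggregation it mentions is omitted here):

```typescript
// Simplified view of an Ollama chat chunk; the PR defines richer interfaces.
interface OllamaChatChunk {
  model?: string;
  message?: { role: string; content: string };
  done?: boolean;
}

// Heuristic: an NDJSON body has 2+ non-empty lines, each a standalone JSON object.
function isNDJSONResponse(body: string): boolean {
  const lines = body.split("\n").filter((l) => l.trim() !== "");
  if (lines.length < 2) return false;
  return lines.every((l) => {
    try {
      JSON.parse(l);
      return true;
    } catch {
      return false;
    }
  });
}

// Merge per-chunk message.content fields into one response object,
// skipping malformed lines instead of failing the whole parse.
function parseOllamaStreamResponse(body: string): OllamaChatChunk {
  const merged: OllamaChatChunk = {
    message: { role: "assistant", content: "" },
    done: false,
  };
  for (const line of body.split("\n")) {
    if (line.trim() === "") continue;
    let chunk: OllamaChatChunk;
    try {
      chunk = JSON.parse(line);
    } catch {
      continue; // tolerate malformed lines, per the commit's edge-case note
    }
    if (chunk.model) merged.model = chunk.model;
    if (chunk.message?.content) merged.message!.content += chunk.message.content;
    if (chunk.done) merged.done = true;
  }
  return merged;
}
```

The key design choice is that a malformed line downgrades gracefully (it is skipped) rather than aborting, since a truncated stream is exactly the failure mode being worked around.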
- Replace console.warn with log() from shared/logger - Remove trailing whitespace from troubleshooting guide - Ensure TypeScript compatibility
All contributors have signed the CLA. Thank you! ✅
code-yeongyu left a comment
LGTM! This is a well-documented fix for the Ollama streaming NDJSON issue.
Review summary:
- Troubleshooting guide is clear and actionable
- NDJSON parser utility is well-tested with real Ollama data
- Configuration documentation properly explains the `stream: false` workaround
- cubic review passed with no issues
The immediate workaround (stream: false) helps users now, and the parser utility sets up future SDK-level integration.
Approving for merge.
Thank you so much for the quick review and merge! 🎉 I'm incredibly grateful for your work on oh-my-opencode. As a graduate student in Korea, this project has been transformative in helping me orchestrate AI models while working within limited subscription budgets.

The Ollama streaming issue (#1124) was blocking my hybrid local/cloud workflow, where I route high-volume tasks to a local GPU (Ollama) to preserve cloud tokens for quality-critical work. This fix enables 60-80% cost savings for budget-conscious users like myself.

I hope this contribution helps other researchers and students who are exploring cost-effective AI orchestration with local models. Thank you for building and maintaining such an incredible tool that democratizes access to advanced AI orchestration! 🙏

P.S. I owe you more than a cup of coffee if we happen to meet in the real world (I also live in Korea 🇰🇷)
FYI: Filed a follow-up issue #1508 regarding category delegation model resolution for Ollama. Your PR #1197 successfully fixed the NDJSON parsing issue; direct agent routing now works perfectly. However, category-based delegation still fails to resolve Ollama models. Details in issue #1508.
Summary
Resolves #1124 - Ollama integration fails with streaming NDJSON responses
This PR provides documentation and utility code for the Ollama streaming issue.
Problem
When using Ollama with oh-my-opencode agents that use tools, requests fail with "JSON Parse error: Unexpected EOF".
Root Cause: Ollama returns NDJSON when `stream: true`, but the SDK expects a single JSON object.

Solution
- Use the `stream: false` setting

Verification
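The verification can be reproduced outside oh-my-opencode by posting directly to Ollama's `/api/chat` endpoint with `stream: false` (the fetch equivalent of the curl check mentioned in the commits). A minimal sketch; the model name is just an example from the docs:

```typescript
// With stream: false, Ollama answers with a single JSON object
// instead of newline-delimited chunks, so one JSON.parse suffices.
const payload = {
  model: "qwen3-coder", // example model; substitute any pulled model
  messages: [{ role: "user", content: "Say hello" }],
  stream: false,
};

async function verify(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  // This parse only succeeds reliably because streaming is disabled.
  const data = JSON.parse(await res.text());
  console.log(data.message?.content);
}
```

If the body were requested with `stream: true` instead, `JSON.parse` would see only the first line's worth of a multi-line NDJSON body and fail with the "Unexpected EOF" error described above.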
Files Changed
- docs/troubleshooting/ollama-streaming-issue.md (NEW - 3.4 KB)
- docs/configurations.md (MODIFIED - +60 lines)
- src/shared/ollama-ndjson-parser.ts (NEW - 4.9 KB)

Checklist
- `bun run typecheck` passes
- `bun run build` succeeds

Summary by cubic
Documents and works around Ollama’s streaming NDJSON responses that caused “JSON Parse error: Unexpected EOF” during tool calls. Adds an NDJSON parser utility and documents setting stream: false to restore stability (fixes #1124).
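For illustration, the documented workaround amounts to a single field in the Ollama provider configuration. The key names below are assumptions for the sketch; the actual schema is what docs/configurations.md in this PR documents:

```jsonc
{
  "provider": {
    "ollama": {
      "baseURL": "http://localhost:11434",
      "stream": false // the workaround: request a single JSON body
    }
  }
}
```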
New Features
Migration
Written for commit c177f56.