Description
When Ollama is registered as a provider in `opencode.json`, the SDK's model listing does not include Ollama models. The provider appears in the `connected` list, but the `models` object remains empty for Ollama.
This causes plugins (e.g., oh-my-opencode) that rely on model discovery to fail when routing tasks to local Ollama models.
What works:
- Ollama provider is recognized (appears in the `connected` array)
- Direct model usage works when a plugin specifies a model explicitly
- Ollama's own API responds correctly (`GET /v1/models` returns all models)
What doesn't work:
- Model listing never includes Ollama models
- Plugin caches (`provider-models.json`) stay empty for Ollama
- Category-based delegation in plugins can't discover Ollama models
Plugins
oh-my-opencode 3.2.3
OpenCode version
1.1.51
Steps to reproduce
1. Install and start Ollama:

   ```sh
   ollama serve
   ollama pull ministral-3:14b-32k-agent
   ```

2. Configure the Ollama provider in `~/.config/opencode/opencode.json`:

   ```json
   {
     "provider": {
       "ollama": {
         "name": "Ollama",
         "npm": "@ai-sdk/openai-compatible",
         "options": {
           "baseURL": "http://localhost:11434/v1"
         },
         "models": {
           "ministral-3:14b-32k-agent": {
             "name": "ministral-3:14b-32k-agent"
           }
         }
       }
     }
   }
   ```

3. Start OpenCode and check the provider cache:

   ```sh
   cat ~/.cache/oh-my-opencode/provider-models.json
   ```

4. Observe `"models": {}` in the output: Ollama models are missing despite the provider being connected.

5. Verify the Ollama API works directly:

   ```sh
   curl http://localhost:11434/v1/models
   # Returns: { "data": [{ "id": "ministral-3:14b-32k-agent", ... }] }
   ```
Screenshot and/or share link
Cache file showing the issue:

```json
{
  "models": {},
  "connected": ["openai", "google", "anthropic", "opencode", "ollama"],
  "updatedAt": "2026-02-05T05:24:52.824Z"
}
```

Note: `ollama` is in `connected` but `models` is empty.
Operating System
Linux
Terminal
SSH
Additional Notes
Current workaround: manually query `ollama list` and inject the models into the provider cache. This is fragile, since the cache is overwritten on every provider refresh.
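As a stopgap, the injection can be scripted; a rough sketch, assuming the cache schema shown above, that `models` maps a provider id to an array of model ids, and Node 18+ for the global `fetch`:

```ts
// Hypothetical workaround: repopulate the oh-my-opencode cache with the
// models Ollama actually serves. Re-run after every provider refresh.
import { readFileSync, writeFileSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

const cachePath = join(homedir(), ".cache/oh-my-opencode/provider-models.json");

async function injectOllamaModels(baseURL = "http://localhost:11434/v1") {
  // Ollama exposes the OpenAI-compatible model listing endpoint.
  const res = await fetch(`${baseURL}/models`);
  const { data } = (await res.json()) as { data: { id: string }[] };

  const cache = JSON.parse(readFileSync(cachePath, "utf8"));
  cache.models ??= {};
  cache.models.ollama = data.map((m) => m.id); // assumed cache shape
  cache.updatedAt = new Date().toISOString();

  writeFileSync(cachePath, JSON.stringify(cache, null, 2));
}

injectOllamaModels().catch(console.error);
```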
Suggested approach: when a provider uses `"npm": "@ai-sdk/openai-compatible"` and has a `baseURL`, query `GET {baseURL}/models` to discover available models. Ollama, LM Studio, and other local servers all support this OpenAI-compatible endpoint.
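A minimal sketch of what that discovery step could look like, given a provider config shaped like the `opencode.json` above; `ProviderConfig` and `discoverModels` are illustrative names, not opencode's actual API:

```ts
// Hypothetical discovery helper: probe the OpenAI-compatible /models
// endpoint of any provider configured with @ai-sdk/openai-compatible.
interface ProviderConfig {
  npm?: string;
  options?: { baseURL?: string };
  models?: Record<string, { name: string }>;
}

async function discoverModels(
  provider: ProviderConfig,
): Promise<Record<string, { name: string }>> {
  const baseURL = provider.options?.baseURL;
  if (provider.npm !== "@ai-sdk/openai-compatible" || !baseURL) {
    return provider.models ?? {};
  }
  // GET {baseURL}/models returns { data: [{ id: "..." }, ...] } on
  // Ollama, LM Studio, and other OpenAI-compatible servers.
  const res = await fetch(`${baseURL}/models`);
  if (!res.ok) return provider.models ?? {}; // fall back to static config
  const { data } = (await res.json()) as { data: { id: string }[] };
  const discovered = Object.fromEntries(
    data.map((m) => [m.id, { name: m.id }] as const),
  );
  // Statically configured models take precedence over discovered ones,
  // so user overrides keep working.
  return { ...discovered, ...provider.models };
}
```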
Impact: Any plugin relying on model discovery cannot use local models without manual workarounds. This affects the growing ecosystem of users running local models for cost savings and privacy.
Related Issues & Active Development
Awareness note: We've identified significant related work and wanted to cross-reference for visibility.
Canonical Feature Request
- Auto-discover models from OpenAI-compatible provider endpoints #6231 — Auto-discover models from OpenAI-compatible provider endpoints (assigned to @thdxr)
Active Development (`models-endpoint` feature branch)
- feat(opencode): add auto model detection for OpenAI-compatible providers #8359 [MERGED → `models-endpoint`, Jan 30] — Auto model detection for OpenAI-compatible providers
- opencode: added logic to probe loaded models from lmstudio, ollama an… #11951 [OPEN → `models-endpoint`, Feb 3] — Probe loaded models from LM Studio, Ollama, and llama.cpp
  - Adds `provider/local/ollama.ts`, `provider/local/lmstudio.ts`, `provider/local/llamacpp.ts`
  - Feature branch not yet promoted to `dev`
Related PRs (direct to `dev`)
- Adding the auto-detection of ollama local with a variable for baseURL #3726 [OPEN] — Auto-detection of ollama local with baseURL variable
- core: auto-discover models from OpenAI-compatible local providers #7644 [CLOSED] — Auto-discover from local providers (earlier attempt)
- feat(opencode): added logic to probe loaded models from lmstudio and ollama #8748 [OPEN] — Probe loaded models from LM Studio and Ollama
- feat(provider): auto-detect Ollama context limits #10758 [OPEN] — Auto-detect Ollama context limits
Related Issues (same root cause)
- OpenCode displays models which aren't configured & don't exist in LM Studio #4232 — OpenCode displays models which aren't configured & don't exist in LM Studio
- [FEATURE]: Ollama - Multiple Ollama Cloud Models are missing #7873 — Ollama - Multiple Cloud Models are missing
- Model List differ between the CLI and Web. #9581 — Model List differ between the CLI and Web
- The cache of the Provider's Models cannot be updatable #7714 — The cache of the Provider's Models cannot be updated
Downstream Impact (oh-my-opencode plugin)
- [Bug]: Category delegation fails to resolve Ollama models despite correct configuration code-yeongyu/oh-my-opencode#1508 — Category delegation fails due to empty model cache
- fix: handle both string[] and object[] formats in provider-models cache code-yeongyu/oh-my-opencode#1509 — Bridge fix: handle both string[] and object[] cache formats (covers the gap until auto-discovery ships)
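For context on that bridge fix, the normalization it describes can be sketched as follows; `CachedModel` and `normalizeModelIds` are illustrative names, not the plugin's actual code:

```ts
// Hypothetical normalizer for the two provider-models cache shapes
// seen in the wild: ["model-a", ...] vs [{ id: "model-a" }, ...].
type CachedModel = string | { id: string };

function normalizeModelIds(models: CachedModel[]): string[] {
  return models.map((m) => (typeof m === "string" ? m : m.id));
}
```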