Any model
Right now, opencode curates the models available through models.dev. That's good if you don't know what to use, but it means you're missing out on dozens of models. Ideally, opencode would query each provider's models endpoint (https://api.openai.com/v1/models, https://openrouter.ai/api/v1/models) to load and display every available model; if that's not possible, at least allow manually typing a model specifier in the /models command.
*fun fact: using the models endpoint guarantees you have the most up-to-date and accurate data, preventing issues like #926
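For reference, this is roughly what those endpoints return (trimmed to the relevant fields; the model IDs below are just illustrative). Every OpenAI-compatible provider exposes the same `data` array of `id` entries, which could populate the /models picker directly:

```json
{
  "object": "list",
  "data": [
    { "id": "gpt-4o", "object": "model", "owned_by": "openai" },
    { "id": "gpt-4o-mini", "object": "model", "owned_by": "openai" }
  ]
}
```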
Any provider
Right now, opencode also curates the list of providers (Anthropic, GitHub Copilot, OpenAI, Google, Amazon Bedrock, Azure, DeepSeek, Groq, Llama, Mistral, Morph, OpenRouter, Vercel, Vertex, and xAI). It's a long and admirable list, but it will never be complete, since new providers and local providers keep springing up. Adding a provider should be simple: type in a name, an OpenAI-compatible API endpoint, and an API key, and start using opencode. There is an "other" flow currently, but it requires you to manually edit the JSON while referring to the docs, which takes us to...
Document it
opencode has just told you to "check the docs for examples" of how to add an OpenAI-compatible provider. You find the config documentation, ctrl+f for "provider", and the closest thing you find to documentation on providers is the "disabled providers" section. The config documentation has many sections - modes, models, themes, logging, keybinds, MCP servers, and disabled providers - but how to actually configure a provider is not one of them.
Now, if you're Dax, you're probably screaming at me right now. I'm so stupid that I can't follow a clearly marked link to "learn more" about models. And in fact, if you follow that link and specifically look at the "local models" section (although your provider isn't local), you'll find an example!

But this is too hard to find. As long as adding a custom OpenAI-compatible provider requires editing the config, the config page should have a providers section that spells out exactly how to set one up.
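As a starting point, that section could show something like the snippet below. This is only a sketch modeled on the docs' local-models example: the field names (npm, options.baseURL, models) are assumptions carried over from that example, and the provider key, endpoint, API key, and model ID are placeholders, so the real example should be checked against the current schema.

```json
{
  "provider": {
    "my-provider": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "My OpenAI-compatible provider",
      "options": {
        "baseURL": "https://llm.example.com/v1",
        "apiKey": "<your-api-key>"
      },
      "models": {
        "my-model": {}
      }
    }
  }
}
```

Even one block like this, sitting next to the other config sections, would have saved the whole ctrl+f detour above.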
Why didn't I just make opencode make a PR to solve this
Unfortunately, my preferred models don't work - see #929 and #926.
