feat: add litellmProxy provider option for explicit LiteLLM compatibility #8658
Conversation
- Add dummy `_noop` tool when message history contains tool calls but no tools provided
- Convert pending/running tool calls to error results in `toModelMessage`
- Add `hasToolCalls` helper and comprehensive tests

Co-authored-by: Mark Henderson <Mark.Henderson99@hotmail.com>
Fixes anomalyco#8246
Fixes anomalyco#2915
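The `hasToolCalls` helper mentioned in this commit could look roughly like the following sketch. The type shapes and the `"tool-call"` part type are assumptions for illustration, not the actual opencode definitions.

```typescript
// Hypothetical message shapes; the real opencode types differ.
interface MessagePart {
  type: string
}

interface Message {
  parts: MessagePart[]
}

// Returns true when any message in history contains a tool-call part,
// i.e. the case where the dummy `_noop` tool injection is needed.
function hasToolCalls(messages: Message[]): boolean {
  return messages.some((msg) =>
    msg.parts.some((part) => part.type === "tool-call"),
  )
}
```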
…lity

Allow users to explicitly enable LiteLLM proxy compatibility for providers that don't have 'litellm' in their ID (e.g., custom gateways).

Config example: `provider.mygateway.options.litellmProxy = true`

This restricts the `_noop` tool injection to only apply when:
1. Provider ID or API ID contains 'litellm' (auto-detected)
2. Provider has explicit `litellmProxy: true` option (opt-in)
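The two detection paths described in this commit could be sketched as below. The `ProviderInfo` shape and the function name are assumptions for illustration; the PR's actual `isLiteLLMProxy` check in `llm.ts` may be structured differently.

```typescript
// Hypothetical provider shape; the real opencode config types differ.
interface ProviderInfo {
  id: string
  api?: string
  options?: Record<string, unknown>
}

// A provider is treated as a LiteLLM proxy when the user opts in via
// `litellmProxy: true`, or when its provider ID or API ID contains "litellm".
function isLiteLLMProxy(provider: ProviderInfo): boolean {
  if (provider.options?.["litellmProxy"] === true) return true
  const haystack = `${provider.id} ${provider.api ?? ""}`.toLowerCase()
  return haystack.includes("litellm")
}
```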
The following comment was made by an LLM; it may be inaccurate:

> Related PR found: PR #8497 (fix: handle dangling tool_use blocks for LiteLLM proxy compatibility). Why it's related:
> Note: This is a related follow-up PR, not a duplicate. They work together to provide progressive enhancement of LiteLLM proxy support.
```ts
input: part.state.input,
errorText: "[Tool execution was interrupted]",
callProviderMetadata: part.metadata,
})
```
This is 100% a bug elsewhere, but we can add this for now; this shouldn't ever happen.
Can we close ur other pr now?
…lity (#8658)

Co-authored-by: Mark Henderson <Mark.Henderson99@hotmail.com>
Co-authored-by: Aiden Cline <63023139+rekram1-node@users.noreply.github.com>
We should probably also document this. Another idea tho: we could update this to handle litellm-related errors (by doing some string matching) and maybe provide some better feedback for the user.
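The string-matching idea from this comment could be sketched as below. This is purely illustrative: the function name, the matched substrings, and the suggestion text are all assumptions, not anything implemented in the PR.

```typescript
// Hypothetical mapper from raw LiteLLM proxy error messages to friendlier
// user-facing feedback. All matched strings are assumptions.
function friendlyLiteLLMError(raw: string): string | null {
  const lower = raw.toLowerCase()
  if (!lower.includes("litellm")) return null
  if (lower.includes("tool_use") && lower.includes("tool_result")) {
    return (
      "LiteLLM proxy rejected the request: a tool call in history has no " +
      "matching result. Try enabling `litellmProxy: true` for this provider."
    )
  }
  // Fall back to a generic hint for unrecognized LiteLLM errors.
  return "LiteLLM proxy returned an error. See the raw message for details."
}
```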
Summary
This PR builds on #8497 and addresses @rekram1-node's feedback from #8248: instead of injecting the `_noop` tool unconditionally for all providers, this change restricts it to only LiteLLM-backed providers.

Problem
#8497 injects the `_noop` tool unconditionally when message history contains tool calls. While this fixes the LiteLLM compatibility issue, it unnecessarily affects providers that don't need it (Anthropic native, OpenRouter, Vertex, Bedrock, etc.).

Solution
Add a `litellmProxy` provider option and restrict `_noop` injection to LiteLLM providers.

Detection methods:

- Provider ID or API ID contains 'litellm' (auto-detected)
- Explicit `litellmProxy: true` option (opt-in, for custom gateways)

Config example:
```json
{
  "provider": {
    "my-gateway": {
      "api": "openai",
      "options": {
        "litellmProxy": true
      }
    }
  }
}
```

Changes
- `packages/opencode/src/session/llm.ts`: `isLiteLLMProxy` check with auto-detection + opt-in support
- `packages/opencode/src/session/message-v2.ts`: convert pending/running tool calls to error results in `toModelMessage`
- `packages/opencode/test/session/llm.test.ts`: tests for the `hasToolCalls` helper (from #8497)
- `packages/opencode/test/session/message-v2.test.ts`

Testing
Related
Co-authored-by: Mark Henderson <Mark.Henderson99@hotmail.com>