Anthropic without OpenRouter middle man #538
Comments
As a workaround, it appears LiteLLM can be set up locally, as this comment describes.
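That workaround might look roughly like this (a hedged sketch: the model name, proxy port, and aider flag are assumptions for illustration, not details confirmed in this thread):

```shell
# Install and start a local LiteLLM proxy that exposes an OpenAI-compatible
# API while forwarding requests to Anthropic (model name is illustrative).
pip install litellm
export ANTHROPIC_API_KEY=sk-ant-...   # your Anthropic key
litellm --model claude-3-opus-20240229

# In another terminal, point aider at the local OpenAI-compatible endpoint
# (port and flag assumed; check your litellm/aider versions).
aider --openai-api-base http://0.0.0.0:4000
```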
I guess I'll close this issue, since #172 covers the concern.
Happy to keep this open, as direct Claude support is high priority for me at the moment.
Aider now supports directly connecting to Anthropic and many other LLM providers: https://aider.chat/docs/llms.html#anthropic
I'm going to close this issue for now, but feel free to add a comment here and I will re-open or file a new issue any time.
OpenRouter is a nice solution, but since Claude is so popular, it would be nice to have direct support within Aider so the extra middleman cost can be avoided.
I found https://github.com/jtsang4/claude-to-chatgpt, which acts as a client-side adapter that lets Claude models be used as if they exposed an OpenAI-compatible API. There's no Claude 3 support quite yet, but perhaps Aider could be used to add it, or maybe we just use claude-to-chatgpt as inspiration for an adapter that lives in Aider's repository.
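The core of such an adapter is translating one request schema into the other. Below is a minimal, hypothetical sketch (not claude-to-chatgpt's actual code) of converting an OpenAI-style chat-completion request into an Anthropic Messages API payload; the default model name and token limit are illustrative assumptions:

```python
def openai_to_anthropic(openai_request):
    """Convert an OpenAI chat-completion request dict into an Anthropic
    Messages API request dict. Illustrative sketch, not exhaustive:
    real adapters also map streaming, stop sequences, tool calls, etc."""
    # Anthropic takes the system prompt as a top-level "system" string,
    # not as a message with role "system".
    system_parts = [m["content"] for m in openai_request["messages"]
                    if m["role"] == "system"]
    # User/assistant turns carry over with the same structure.
    messages = [m for m in openai_request["messages"]
                if m["role"] in ("user", "assistant")]
    payload = {
        # Model name and max_tokens default are assumptions for the sketch;
        # Anthropic requires max_tokens, OpenAI treats it as optional.
        "model": openai_request.get("model", "claude-3-opus-20240229"),
        "max_tokens": openai_request.get("max_tokens", 1024),
        "messages": messages,
    }
    if system_parts:
        payload["system"] = "\n".join(system_parts)
    return payload
```

An adapter process would apply this translation on each incoming request, call Anthropic's endpoint, and then perform the reverse mapping on the response.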