Litellm/01 is unable to connect to non-OpenAI providers. #272
Comments
This is also the question I want to ask, since it turns out the command line is written like this. Do I need to install and start the litellm service first to get a local connection endpoint?
Since OpenInterpreter uses litellm, I think you need to specify this differently. Here is what I think would work: `poetry run 01 --model "groq/gemma-7b-it" --tts-service piper --stt-service local-whisper`. litellm already pulls all the provider details automatically if you specify the provider in the model name, or at least it should. Here are some instructions on how to get it working with OpenRouter: https://discordapp.com/channels/1146610656779440188/1194880263122075688/1240334434352365569
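For readability, here is that command as a fenced block. This is only a sketch: the `--model`, `--tts-service`, and `--stt-service` flags are taken as quoted above, and it assumes the Groq key is supplied via the `GROQ_API_KEY` environment variable that litellm reads for `groq/` models (so no separate litellm server needs to be started):

```bash
# Sketch of the command from the comment above, with the quoting cleaned up.
# litellm reads the Groq key from GROQ_API_KEY for models prefixed with "groq/".
export GROQ_API_KEY="gsk_..."   # placeholder

poetry run 01 --model "groq/gemma-7b-it" \
  --tts-service piper \
  --stt-service local-whisper
```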
Well, as you know, in the Discord community some people seemed to suggest that 01 automatically appends "openai/" before the model name specified in the arguments. So, for instance, you might end up with "openai/groq/gemma-7b-it". Is that what's causing the issue?
If it does, why the need to specify all these details when people can use Open Interpreter directly?
Does the project team care about this? No developer has responded to these questions for so many days.
If you want to get 01 to work with OpenRouter (and others?), you can try this (see the sketch below). It's still super unintuitive and I think it should be made more intuitive, but you can make it work. The OpenAI key is for Whisper and TTS; if you use a local model you can leave it out. I also forgot the "poetry install" before "poetry run". A different model name would be "openrouter/meta-llama/llama-3-70b".
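As a rough sketch of what that setup might look like (the 01 flag names are an assumption carried over from the Groq example earlier in the thread; `OPENROUTER_API_KEY` is the variable litellm reads for `openrouter/` models, and the OpenAI key matters only if you use OpenAI-hosted Whisper/TTS):

```bash
# Sketch only: the 01 flag names here are carried over from the Groq example above.
poetry install                          # the step the comment says was forgotten

export OPENROUTER_API_KEY="sk-or-..."   # placeholder; read by litellm for "openrouter/" models
export OPENAI_API_KEY="sk-..."          # only needed for hosted Whisper/TTS; omit with local STT/TTS

poetry run 01 --model "openrouter/meta-llama/llama-3-70b"
```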
What if I don't have an OpenAI key or a local model? How can I use Whisper and TTS then? Can I use only an OpenRouter API key for everything?
OpenRouter does not have Whisper. There is a rewrite on the way that implements more options for TTS and STT: https://github.com/KillianLucas/01-rewrite
This is how I ran 01 with Groq and local TTS/STT:
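The command block from this comment did not survive, so the following is only a hypothetical reconstruction based on the surrounding discussion; the model name, flags, and key are all placeholders:

```bash
# Hypothetical reconstruction: the original command block was not preserved.
export GROQ_API_KEY="gsk_XXXXXXXXXXXXXXXX"   # placeholder (the key originally posted here was revoked)

poetry run 01 --model "groq/gemma-7b-it" \
  --tts-service piper \
  --stt-service local-whisper
```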
Can you tell us which of the lines we need to change?
@aj47 just making sure that the API key is fake or revoked ;)
yep all g, revoked before posting
Thank you, sir, this works for me. Really, really appreciated.
What causes the issue:
Run 01 specifying any non-OpenAI server host and API key.
Expected:
Be able to connect to other services like Groq, Anthropic, OpenRouter, etc., as they seem to be working with the base Open Interpreter.
Screenshots:
Using:
Feedback
After many attempts using different settings, it seems that either 01 is not passing the right arguments to litellm, or litellm isn't yet correctly configured for other providers in 01.