Thanks for reporting this! I can fix it. I will have to add a warning that your reverse proxy is incompatible with LangChain if it doesn't match OpenAI's v1 schema, as in this case.
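For context, the v1 schema mentioned above refers to the shape of an OpenAI chat-completions response. The sample payload below is a hypothetical illustration, but the `choices[0].message.content` path is what OpenAI-style clients read, so a reverse proxy that omits it will break them:

```python
import json

# Hypothetical sample of an OpenAI v1 chat-completions response body.
# A LangChain-based client such as LibreChat expects a reverse proxy
# (LiteLLM here) to return JSON in this shape.
sample = json.loads("""
{
  "id": "chatcmpl-123",
  "object": "chat.completion",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Hello!"},
      "finish_reason": "stop"
    }
  ]
}
""")

# The client reads the reply text from choices[0].message.content;
# a proxy response missing this structure causes a parse error.
reply = sample["choices"][0]["message"]["content"]
print(reply)
```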
Contact Details
mikejohnsonit@gmail.com
What happened?
This is a bit of an odd scenario. I have Ollama/codellama running behind the OpenAI-compatible frontend of LiteLLM.
I have confirmed that LiteLLM is responding correctly using Aider.
When pointing LibreChat to the local machine on port 8000 (the port LiteLLM is running on), or even to the local LAN IP, I get the following error from LibreChat in the docker compose logs (see the logs section).
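One thing worth checking in this setup: when LibreChat runs under docker compose, "localhost" inside the container refers to the container itself, not the host where LiteLLM listens. A minimal sketch of the relevant .env setting, assuming LiteLLM on host port 8000 (the variable name is taken from LibreChat's example config and may differ between versions):

```shell
# From inside the LibreChat container, reach the Docker host rather
# than the container's own loopback interface:
OPENAI_REVERSE_PROXY=http://host.docker.internal:8000/v1
```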
Any ideas?
Steps to Reproduce
gpt-3.5-turbo model

What browsers are you seeing the problem on?
No response
Relevant log output
Screenshots
Code of Conduct