[Question]: No Clue Why This Service Stopped Working #4786
Unanswered
georgije-conv asked this question in Troubleshooting
Replies: 1 comment 1 reply
-
From the logs, it doesn't look like there are any notable errors. Even if the model fetching fails, there should still be no issues. Double-check the
-
What is your question?
I have been using the service for a few days now and have been loving it. Out of nowhere, it just stopped working. I have no idea what's going on under the hood, but it looks like it's an API connection issue? I can't figure out why it doesn't work, as basic API calls to OpenAI from Python work just fine with my API key. It's not a connection issue; I can access localhost just fine, but I can no longer get any model responses.
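For reference, this is roughly the check I ran outside LibreChat (a minimal sketch, assuming the openai Python package v1.x is installed and OPENAI_API_KEY is set in the environment):

```python
# Minimal sanity check outside LibreChat: confirm the key and gpt-4o-mini
# respond directly. Assumes openai>=1.0 and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say 'ok' if you can hear me."}],
)
print(response.choices[0].message.content)
```

This returns a normal completion, so the key itself seems fine.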
I disabled all models and providers except for OpenAI and GPT-4o-mini, and it is still having problems. There is a log file below. I deleted all containers and images, recloned the repository, etc., but the issue remains.
Does it have something to do with the librechat.yml file not being configured? I don't recall ever configuring this in the past.
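In case it helps narrow things down, this is the kind of raw model-list check I can run against the OpenAI API directly (a minimal sketch using requests, assuming OPENAI_API_KEY is exported; I'm not sure it mirrors whatever model fetching LibreChat does internally):

```python
# Rough stand-in for a model-list fetch: query the OpenAI /v1/models endpoint
# directly. Assumes OPENAI_API_KEY is exported; this is only a guess at what
# "model fetching" involves, not LibreChat's actual code path.
import os
import requests

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
print([m["id"] for m in resp.json()["data"]][:10])  # first few model ids
```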
More Details
What is the main subject of your question?
No response
Screenshots
Code of Conduct