- It is possible to run other LLMs locally using LocalAI or other OpenAI-compatible local backends. In the config file we can point the API base URL at the local backend.
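  For instance, a minimal sketch with the `openai` Python client (v1+), assuming LocalAI is serving its OpenAI-compatible API on its default port 8080; the endpoint and model alias are placeholders for whatever your local backend actually exposes:

  ```python
  from openai import OpenAI

  # Point the client at a local OpenAI-compatible backend instead of api.openai.com.
  # Assumes LocalAI is running on localhost:8080 (its default); most local backends
  # ignore the API key, but the client still requires a non-empty string.
  client = OpenAI(
      base_url="http://localhost:8080/v1",
      api_key="not-needed",
  )

  # "gpt-3.5-turbo" here is just the alias the local backend maps to a local model.
  response = client.chat.completions.create(
      model="gpt-3.5-turbo",
      messages=[{"role": "user", "content": "Hello from a local backend!"}],
  )
  print(response.choices[0].message.content)
  ```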
- Surely we can get a local method of running this with https://github.com/jmorganca/ollama.
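  If the tool's config lets you override the API base URL, the same approach should work with Ollama. A sketch against Ollama's native REST API, assuming the Ollama server is running locally on its default port 11434 and the `llama2` model has already been pulled:

  ```python
  import requests

  # Call Ollama's local generate endpoint directly.
  # Assumes `ollama serve` is running on localhost:11434 and `ollama pull llama2` was done.
  resp = requests.post(
      "http://localhost:11434/api/generate",
      json={
          "model": "llama2",
          "prompt": "Hello from Ollama!",
          "stream": False,  # return a single JSON object instead of a stream
      },
      timeout=120,
  )
  resp.raise_for_status()
  print(resp.json()["response"])
  ```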