This is spun out from #1.
Right now you can run custom LLMs using Ollama, but chatd expects that the model has already been downloaded before Ollama is connected to chatd. chatd should be able to handle downloading new models directly, without using the Ollama CLI.
Workaround: run `ollama pull <model name>` in a terminal and download the model before switching to it in chatd for the time being.
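For reference, Ollama already exposes pulling over its local HTTP API (`POST /api/pull`, which streams newline-delimited JSON progress objects), so chatd could trigger the download itself instead of shelling out to the CLI. Below is a rough TypeScript sketch of what that might look like; `pullModel`, `PullProgress`, and the progress callback are illustrative names rather than anything in chatd's codebase, and it assumes Ollama is listening on its default address `127.0.0.1:11434`.

```ts
// Illustrative sketch only: pull a model through Ollama's HTTP API rather than the CLI.
// Assumes the default Ollama endpoint; names below are not from chatd's actual code.

const OLLAMA_HOST = "http://127.0.0.1:11434";

interface PullProgress {
  status: string;
  total?: number;
  completed?: number;
}

async function pullModel(
  model: string,
  onProgress: (p: PullProgress) => void,
): Promise<void> {
  // POST /api/pull streams newline-delimited JSON status objects.
  const res = await fetch(`${OLLAMA_HOST}/api/pull`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: model }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`pull failed: ${res.status} ${res.statusText}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });

    // Each complete line in the stream is one JSON progress object.
    let newline;
    while ((newline = buffered.indexOf("\n")) !== -1) {
      const line = buffered.slice(0, newline).trim();
      buffered = buffered.slice(newline + 1);
      if (line) onProgress(JSON.parse(line) as PullProgress);
    }
  }
}

// Example: report download progress while fetching a model.
pullModel("mistral", (p) => {
  if (p.total && p.completed) {
    console.log(`${p.status}: ${Math.round((100 * p.completed) / p.total)}%`);
  } else {
    console.log(p.status);
  }
}).catch(console.error);
```

Because the response streams incrementally, chatd could surface a progress indicator in the model picker rather than blocking until the pull finishes.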