Has anyone been able to get this to work with a locally hosted model? I have a model I fine-tuned for our community and need a way to make it easily accessible, so I'm thinking a Discord server — but I've never messed with Discord bots (or coding, honestly; it's not my department, but I want a working example when I pitch it to the devs). Any info or pointers would be super helpful, thanks!
Change the base URL in config.yml to point at your local Ollama server's OpenAI-compatible endpoint (API_BASE_URL: http://localhost:11434/v1/ — note Ollama serves plain HTTP locally, not HTTPS), and change the model ID to your fine-tuned Ollama model (MODEL_ID: your-custom-ollama-model-id). You can check your installed Ollama models at http://localhost:11434/api/tags
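Assuming the bot reads its settings from a config.yml with the key names used in the reply above, the two entries might look like this (the model ID is a placeholder — substitute the name of your fine-tuned model as it appears in `ollama list`):

```yaml
# Point the bot at a local Ollama server's OpenAI-compatible endpoint.
# Ollama listens on plain HTTP at port 11434 by default.
API_BASE_URL: http://localhost:11434/v1/

# Placeholder: replace with your fine-tuned model's name, e.g. as shown
# by `ollama list` or in the JSON from http://localhost:11434/api/tags
MODEL_ID: your-custom-ollama-model-id
```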