This repository has been archived by the owner on May 18, 2024. It is now read-only.

Local Hosted Models (ollama) #265

Answered by mishl-dev
Dcamy asked this question in Q&A

Change the base URL in config.yml:

API_BASE_URL: http://localhost:11434/openai/v1/

Also change the model ID to your fine-tuned Ollama model:

MODEL_ID: your-custom-ollama-model-id
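
Taken together, the relevant portion of config.yml would look something like this (a sketch; the model name is illustrative, and the rest of your config stays unchanged):

```yaml
# Point the bot at a local Ollama server's OpenAI-compatible endpoint
API_BASE_URL: http://localhost:11434/openai/v1/

# Must match a model installed in Ollama (e.g. via `ollama pull` or a Modelfile)
MODEL_ID: your-custom-ollama-model-id
```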

You can check which Ollama models you have installed at http://localhost:11434/api/tags
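
If you'd rather check from a script, the /api/tags endpoint returns JSON with a "models" array; a small helper can pull out the names to use as MODEL_ID. The sample payload below is illustrative, not from a real server:

```python
import json

def list_model_names(tags_json: str) -> list[str]:
    """Extract model names from the JSON shape returned by Ollama's /api/tags."""
    payload = json.loads(tags_json)
    return [m["name"] for m in payload.get("models", [])]

# Illustrative payload in the /api/tags shape (your server will list your own models)
sample = '{"models": [{"name": "llama2:latest"}, {"name": "my-finetune:latest"}]}'
print(list_model_names(sample))  # → ['llama2:latest', 'my-finetune:latest']
```

Any name printed here is a valid value for MODEL_ID in config.yml.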

Answer selected by mishl-dev