
How can I use my custom Ollama model in this project? #41

Answered by cediorio
vatche-t asked this question in Q&A

I was able to use Ollama easily. Here's the relevant code:

import instructor
from openai import OpenAI

# ...

client = instructor.from_openai(
    OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",  # required by the OpenAI client, but not actually used by Ollama
    ),
    mode=instructor.Mode.JSON,
)
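
For completeness, here's a minimal sketch of how the patched client can then be used to get structured output. It continues from the client above; the model name ("llama3") and the UserInfo schema are just illustrative assumptions, and it presumes a local Ollama server is running with that model pulled:

from pydantic import BaseModel

class UserInfo(BaseModel):
    name: str
    age: int

# The instructor-patched client accepts response_model and returns a validated instance
user = client.chat.completions.create(
    model="llama3",  # any model available in the local Ollama instance
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 35 years old."}],
)
print(user.name, user.age)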

Answer selected by KennyVaneetvelde