
How can we run Wren AI with the llama3 model? #336

Answered by cyyeh
hbanand asked this question in Q&A


@hbanand Thanks for reaching out! Please check this out for further details: #277 (comment)

In brief, there might be issues if you use Ollama's models, since the LLM output might not always conform to valid JSON. I've already added Ollama support in this branch: https://github.com/Canner/WrenAI/tree/feature/ai-service/add-ollama. Feel free to check it out! I also think we should customize prompts depending on the model we choose, so there is still room for improvement. I would like to hear your thoughts in any case.
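
As a quick illustration of why this can still work, Ollama itself exposes a JSON mode (`"format": "json"` on its `/api/generate` endpoint) that constrains the response to valid JSON. Below is a minimal sketch, assuming a local Ollama server with `llama3` already pulled; the prompt is an illustrative placeholder, not WrenAI's actual integration code:

```python
# Minimal sketch: calling a local Ollama server's JSON mode with llama3.
# Assumes `ollama serve` is running and `ollama pull llama3` has been done.
import json
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": 'List three SQL aggregate functions as a JSON object '
                  'with a "functions" array.',
        "format": "json",   # ask Ollama to constrain the output to valid JSON
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()

# The model's reply is itself a JSON string; parsing it verifies conformance.
data = json.loads(resp.json()["response"])
print(data)
```

Even with JSON mode, smaller models can still drift from the exact schema a pipeline expects, which is why prompt customization per model matters.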

Regarding the broken JSON output issue, I am thinking of adding a component to help solve it: https://github.com/noamgat/lm-format-enforcer. I would also like to hear more of your thoughts h…
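
For a sense of the approach that library enables, here is a rough sketch of constraining a Hugging Face model so it can only emit tokens that keep the output valid against a JSON schema. The model name and schema are placeholders (pydantic v2 assumed), not WrenAI code:

```python
# Sketch: schema-constrained generation with lm-format-enforcer + transformers.
# pip install lm-format-enforcer transformers pydantic
from pydantic import BaseModel
from transformers import AutoModelForCausalLM, AutoTokenizer
from lmformatenforcer import JsonSchemaParser
from lmformatenforcer.integrations.transformers import (
    build_transformers_prefix_allowed_tokens_fn,
)

class Answer(BaseModel):  # placeholder schema for illustration
    sql: str
    summary: str

model_name = "meta-llama/Meta-Llama-3-8B-Instruct"  # placeholder model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# The parser tracks which tokens can legally extend a JSON document
# matching the schema; the prefix function masks out everything else.
parser = JsonSchemaParser(Answer.model_json_schema())
prefix_fn = build_transformers_prefix_allowed_tokens_fn(tokenizer, parser)

prompt = "Return a SQL query counting users, as JSON with keys sql and summary: "
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(
    **inputs, max_new_tokens=200, prefix_allowed_tokens_fn=prefix_fn
)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:]))
```

The appeal over post-hoc JSON repair is that malformed output is prevented at decode time rather than fixed afterwards.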
