
how to use codellama instead of openai api? #154

Open
ouvaa opened this issue Feb 5, 2024 · 3 comments

Comments


ouvaa commented Feb 5, 2024

How can I use CodeLlama instead of the OpenAI API?
How can I make it use llama.cpp?
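The thread doesn't include a confirmed answer, but one common workaround is that llama.cpp's built-in server exposes an OpenAI-compatible endpoint, so an existing OpenAI client can often be pointed at it by changing the base URL. A minimal sketch, assuming a local `llama-server` is already running with a CodeLlama GGUF model (the port and model file name below are examples, not from this project):

```python
# Hypothetical sketch: reuse the OpenAI Python client against a local llama.cpp server.
# Assumes something like:  ./llama-server -m codellama-7b-instruct.Q4_K_M.gguf --port 8080
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # llama.cpp's OpenAI-compatible endpoint
    api_key="sk-no-key-required",         # llama.cpp does not validate the key by default
)

response = client.chat.completions.create(
    model="codellama",  # largely informational for a single-model llama.cpp server
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
)
print(response.choices[0].message.content)
```

Whether this works for the project depends on whether its OpenAI base URL and API key are configurable.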

@uniAIDevs

I'm working on a solution to integrate the HuggingFace Inference API into the program! I'll consider adding it as an additional feature. This may take some time, but I use other developer agents to speed up the small tasks. There will be an alternate version created by me soon!
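For context, this is not uniAIDevs' actual integration, just a hedged sketch of what calling the Hugging Face Inference API can look like with the `huggingface_hub` client (the model ID and token placeholder are illustrative):

```python
# Illustrative sketch of the Hugging Face Inference API, not the integration described above.
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="codellama/CodeLlama-7b-Instruct-hf",  # example model ID
    token="hf_...",                              # your Hugging Face access token
)

output = client.text_generation(
    "Write a Python function that reverses a string.",
    max_new_tokens=256,
)
print(output)
```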


bvelker commented Apr 1, 2024

It might be more efficient to integrate LiteLLM: https://github.com/BerriAI/litellm
It already has access to basically all LLM APIs, including Hugging Face and OpenRouter.
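To make the suggestion concrete, LiteLLM exposes a single `completion()` call where the provider is selected by a prefix on the model string. A minimal sketch (model names are examples, not choices made by this project):

```python
# Minimal LiteLLM sketch: the same completion() call works across providers.
from litellm import completion

messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]

# OpenAI (default provider)
resp = completion(model="gpt-3.5-turbo", messages=messages)

# Hugging Face Inference API (the "huggingface/" prefix selects the backend)
resp = completion(model="huggingface/codellama/CodeLlama-7b-Instruct-hf", messages=messages)

print(resp.choices[0].message.content)
```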


timiil commented May 10, 2024

> It might be more efficient to integrate LiteLLM https://github.com/BerriAI/litellm It already has access to basically all LLM APIs, including Hugging Face and OpenRouter.

Is there any example of how we can use LiteLLM inside this project?
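No project-specific example appears in this thread. As a hedged sketch only, LiteLLM can also target any OpenAI-compatible server via `api_base`, which would tie this back to the original llama.cpp question if the project's OpenAI calls were swapped for `litellm.completion()` (endpoint, key, and model name below are assumptions):

```python
# Hedged sketch: LiteLLM pointed at a local OpenAI-compatible server
# (e.g. llama.cpp's llama-server). Not code from this project.
from litellm import completion

resp = completion(
    model="openai/codellama",             # "openai/" prefix = generic OpenAI-compatible backend
    api_base="http://localhost:8080/v1",  # local llama.cpp server
    api_key="sk-no-key-required",
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
)
print(resp.choices[0].message.content)
```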
