
HuggingFaceH4/starchat-alpha CPP LLM #1441

Closed
ekolawole opened this issue May 14, 2023 · 3 comments

@ekolawole

What we need is a local cpp implementation for the LLM "HuggingFaceH4/starchat-alpha". It is much better than any other open-source chat model at coding, and it does very well at chatting too. It is better than Vicuna.
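For context, a minimal sketch of how this model is typically run today through Hugging Face transformers — the Python-plus-GPU workload a local cpp port would replace. The `<|system|>`/`<|user|>`/`<|assistant|>`/`<|end|>` markers follow the model card's chat format and, like the sampling settings, are assumptions here rather than anything specified in this issue:

```python
# Sketch: running HuggingFaceH4/starchat-alpha via transformers.
# Assumes the chat-format markers from the model card; sampling
# parameters are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceH4/starchat-alpha"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the `accelerate` package; the model has
# ~16B parameters, so substantial GPU memory (or CPU RAM) is needed.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = (
    "<|system|>\n<|end|>\n"
    "<|user|>\nWrite a Python function that reverses a string.<|end|>\n"
    "<|assistant|>\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.2,
    # Stop at the chat-format end marker (assumed from the model card).
    eos_token_id=tokenizer.convert_tokens_to_ids("<|end|>"),
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Running this needs the full PyTorch stack and tens of gigabytes of memory, which is exactly the overhead a ggml-style C++ port would avoid.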

@chat-guy

This would be great. It is indeed significantly better than Vicuna as far as code generation is concerned.

@s-kostyaev

Looks like we need to integrate https://github.com/ggerganov/ggml/tree/master/examples/starcoder into llama.cpp.


github-actions bot commented Apr 9, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions bot closed this as completed on Apr 9, 2024.