Forked from kaust-generative-ai/local-deployment.
Issues: kaust-generative-ai/local-deployment-llama-cpp
Issues list
#17 · Add reverse prompting example · enhancement (New feature or request) · opened Oct 16, 2024 by davidrpugh
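A minimal sketch of what such a reverse-prompting example might look like, assuming the llama-cli binary from llama.cpp; the model path and the "User:" reverse-prompt string are placeholders:

```shell
# Build the llama-cli invocation as a string so the flags are easy to inspect.
# -i enables interactive mode; -r hands control back to the user whenever the
# model emits the reverse-prompt string ("User:" here).
MODEL="model.gguf"   # placeholder path to a local GGUF model
CMD="llama-cli -m $MODEL -i -r \"User:\" -p \"User:\""
echo "$CMD"
```

Running the echoed command starts an interactive chat in which generation pauses each time the model produces "User:".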
#13 · Usage examples for --logit-bias · enhancement (New feature or request) · opened Oct 16, 2024 by davidrpugh
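A hedged sketch of the kind of usage example requested above, assuming llama-cli from llama.cpp, whose --logit-bias flag takes a token id followed by a signed bias; the token id 15043 and the model path are placeholders, since real ids depend on the model's tokenizer:

```shell
# --logit-bias TOKEN_ID(+/-)BIAS adds the bias to that token's logit before
# sampling; a bias of -inf bans the token outright.
BIAS_UP="--logit-bias 15043+5"      # make token 15043 more likely
BIAS_BAN="--logit-bias 15043-inf"   # forbid token 15043 entirely
echo "llama-cli -m model.gguf -p 'Hello' $BIAS_UP"
echo "llama-cli -m model.gguf -p 'Hello' $BIAS_BAN"
```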
#11 · Modify build scripts to enable curl support · enhancement (New feature or request) · opened Oct 15, 2024 by davidrpugh
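One plausible shape for this change, assuming a CMake-based build of llama.cpp: its LLAMA_CURL option links against libcurl so models can be fetched from URLs at run time. A sketch, not the repository's actual build script:

```shell
# Enable curl support when configuring the llama.cpp build.
CMAKE_FLAGS="-DLLAMA_CURL=ON"
echo "cmake -B build $CMAKE_FLAGS"
echo "cmake --build build --config Release"
```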
#10 · Configure the cache directory used by LLaMA C++ · enhancement (New feature or request) · opened Oct 15, 2024 by davidrpugh
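A minimal sketch of one way to do this, assuming llama.cpp honors the LLAMA_CACHE environment variable for downloaded model files; the path below is a placeholder:

```shell
# Point llama.cpp's download cache at a directory of our choosing.
export LLAMA_CACHE="$HOME/.cache/llama.cpp"
echo "cache dir: $LLAMA_CACHE"
```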
#9 · LLaMA C++ integration with HF · enhancement (New feature or request) · opened Oct 7, 2024 by davidrpugh
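A sketch of what Hugging Face integration might look like, assuming llama-cli's --hf-repo and --hf-file flags (which require a build with curl support) for pulling a GGUF file straight from the Hub; the repo and file names below are placeholders:

```shell
# Fetch a GGUF model from the Hugging Face Hub by repo and file name,
# then run a prompt against it (repo/file are illustrative placeholders).
HF_ARGS="--hf-repo ggml-org/models --hf-file tinyllamas/stories260K.gguf"
echo "llama-cli $HF_ARGS -p 'Once upon a time'"
```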