Add instructions for using Alpaca #240
Comments
@ggerganov This would be a great addition - and maybe extended to wasm so the model can run in a browser, if possible? Big question though - the LLaMA weights themselves seem to be under a restricted license, same with Alpaca.
Just to chime in: I've had luck running the Alpaca model provided by that repository with this repository's code as-is, as a replacement for the 7B LLaMA model. Seems to work well enough, and I don't see it hurting to add more info and functionality here.
It appears the 13B Alpaca model provided by the alpaca.cpp repository cannot be loaded with llama.cpp, however. I assume it expects the model to be in two parts.
Yes, they are hardcoded right now. See their patch: antimatter15@97d327e
We can add a temporary command-line argument.
Ah, that does the trick - loaded the weights up fine with that change. A temporary argument for loading them also sounds good.
I think this model part count needs a more permanent solution: either some kind of auto-discovery, or a value written inside the ggml file.
Vicunas here we come. 😉
Also start adding prompts in `./prompts`

See the work here: https://github.com/antimatter15/alpaca.cpp

There is no new functionality added, just a few hardcoded parameters in `chat.cpp`. Instead of adding a separate `chat` program, we should have an `alpaca.py` script that runs `main` with the respective parameters, so the user can simply run `./alpaca.py` in the terminal.

It is a good time to start collecting prompts, so create a few useful Alpaca instruction prompts and place them in a `prompts` folder in the source tree. Make the `alpaca.py` script use one of them by default, and add an option to change it.

Add short instructions for using `alpaca.py` for various tasks (translation, answering, ... whatever is popular) in the README, and reference the alpaca.cpp repo for downloading the models.