Add a webui to this #1479
Comments
oobabooga/text-generation-webui already has support for llama.cpp – if you have trouble making it work with this repo, you should probably ask there.
I think llama.cpp itself should not have a GUI. Instead it should be packaged as a library that external applications can then in turn use for inference.
See the new code in examples/server – web UI included!
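For reference, a minimal sketch of building and launching that built-in server from the llama.cpp repo root. The model path and filename here are assumptions taken from the issue below; substitute your own model file:

```shell
# Build the server example (requires a recent checkout with examples/server)
make server

# Launch it with a local GGML model and open the bundled web UI in a browser
./server -m ./models/ggml-vic13b-uncensored-q8_0.bin -c 2048 --host 127.0.0.1 --port 8080

# The web UI is then served at http://127.0.0.1:8080
```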
It's great that a chat UI was added. Please add a text-continuation UI as well.
Closing as completed.
This repo was the only way I found to successfully run a Vicuna model (ggml-vic13b-uncensored-q8_0.bin), but I miss a web UI.
Can you suggest how to add one?
How can I make text-generation-webui use this llama.cpp ?
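Whatever front end ends up driving the model, Vicuna fine-tunes expect their chat-style prompt format rather than raw text. A minimal sketch of building such a prompt, assuming the common Vicuna v1.1 "USER:/ASSISTANT:" template (other fine-tunes may use a different format, so check your model's card):

```python
def build_vicuna_prompt(user_message, history=None):
    """Build a Vicuna-1.1-style chat prompt string.

    Assumption: the model follows the widespread "USER: ... ASSISTANT: ..."
    template with a generic system preamble; adjust for your fine-tune.
    """
    system = ("A chat between a curious user and an artificial intelligence "
              "assistant. The assistant gives helpful, detailed, and polite "
              "answers to the user's questions.")
    parts = [system]
    # Replay prior turns so the model sees the full conversation.
    for user_turn, assistant_turn in (history or []):
        parts.append(f"USER: {user_turn}")
        parts.append(f"ASSISTANT: {assistant_turn}")
    # End with an open ASSISTANT: tag so generation continues the reply.
    parts.append(f"USER: {user_message}")
    parts.append("ASSISTANT:")
    return "\n".join(parts)
```

The resulting string can be sent as the prompt to any llama.cpp front end (e.g. the `main` example or the server's completion endpoint).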