
Add a webui to this #1479


Closed

suoko opened this issue May 16, 2023 · 6 comments

Comments

@suoko

suoko commented May 16, 2023

This repo was the only way I found to successfully run a Vicuna model (ggml-vic13b-uncensored-q8_0.bin), but it lacks a web UI.
Can you suggest how to add one?
How can I make text-generation-webui use this llama.cpp?

@akx
Contributor

akx commented May 16, 2023

oobabooga/text-generation-webui already has support for llama.cpp – if you have trouble making it work with this repo, you should probably ask there.

@JohannesGaessler
Collaborator

I think llama.cpp itself should not have a GUI. Instead it should be packaged as a library that external applications can then in turn use for inference.

@evanmiller
Contributor

See the new code in examples/server – a web UI is included!
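For anyone who wants to script against that server rather than use the bundled web page, it also exposes an HTTP API. A minimal sketch, assuming the `/completion` endpoint and JSON fields described in the examples/server README, and a server already running on `127.0.0.1:8080` (address, port, and field names are assumptions and may differ between llama.cpp versions):

```python
import json
import urllib.request

SERVER = "http://127.0.0.1:8080"  # assumed default llama.cpp server address


def build_request(prompt, n_predict=128):
    """Build a POST request for the server's /completion endpoint."""
    body = json.dumps({"prompt": prompt, "n_predict": n_predict}).encode()
    return urllib.request.Request(
        f"{SERVER}/completion",
        data=body,
        headers={"Content-Type": "application/json"},
    )


def complete(prompt, n_predict=128):
    """Send the prompt to a running llama.cpp server and return its text."""
    with urllib.request.urlopen(build_request(prompt, n_predict)) as resp:
        return json.loads(resp.read())["content"]
```

Note that `build_request` is separated from the network call so the payload can be inspected (or unit-tested) without a running server.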

@rain-1

rain-1 commented Jul 7, 2023

It's great that a chat UI was added. Please add a text-continuation UI as well.

@rain-1

rain-1 commented Jul 7, 2023

@Green-Sky
Collaborator

Closing as completed.
