helm
lws
modal
open-webui
skypilot
triton
:::
(deployment-open-webui)=

# Open WebUI

1. Install [Docker](https://docs.docker.com/engine/install/).

2. Start the vLLM server with a supported chat completion model, e.g.
8+
```console
vllm serve qwen/Qwen1.5-0.5B-Chat
```
12+
3. Start the [Open WebUI](https://github.com/open-webui/open-webui) Docker container (replace `<vllm serve host>` and `<vllm serve port>` with your server's host and port):
14+
```console
docker run -d -p 3000:8080 \
  --name open-webui \
  -v open-webui:/app/backend/data \
  -e OPENAI_API_BASE_URL=http://<vllm serve host>:<vllm serve port>/v1 \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```
23+
4. Open it in the browser: <http://open-webui-host:3000/>
25+
At the top of the web page, you should see the model `qwen/Qwen1.5-0.5B-Chat`.
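Open WebUI talks to vLLM through vLLM's OpenAI-compatible API, which is what the `OPENAI_API_BASE_URL` environment variable above points at. If the model does not appear, you can check that the endpoint is reachable by listing the served models directly (the host and port below are placeholders for your own deployment):

```console
curl http://<vllm serve host>:<vllm serve port>/v1/models
```

The response should include `qwen/Qwen1.5-0.5B-Chat` in its `data` array; if the request fails, the Open WebUI container cannot reach vLLM either.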
27+
:::{image} /assets/deployment/open_webui.png
:::