# Setup Open WebUI with vLLM

1. Install [Docker](https://docs.docker.com/engine/install/).

2. Start the vLLM server with a supported chat completion model, e.g.

   ```console
   vllm serve qwen/Qwen1.5-0.5B-Chat
   ```
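
   Optionally, you can confirm that the OpenAI-compatible API is reachable before wiring in Open WebUI. A minimal check, assuming vLLM's default port 8000 (adjust if you passed `--port`):

   ```console
   # The served model list should include qwen/Qwen1.5-0.5B-Chat
   curl http://<vllm serve host>:8000/v1/models

   # A quick test chat completion request
   curl http://<vllm serve host>:8000/v1/chat/completions \
     -H "Content-Type: application/json" \
     -d '{"model": "qwen/Qwen1.5-0.5B-Chat", "messages": [{"role": "user", "content": "Hello!"}]}'
   ```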

3. Start the [Open WebUI](https://github.com/open-webui/open-webui) Docker container (replace `<vllm serve host>` and `<vllm serve port>` with the host and port of your vLLM server):

   ```console
   docker run -d -p 3000:8080 \
     --name open-webui \
     -v open-webui:/app/backend/data \
     -e OPENAI_API_BASE_URL=http://<vllm serve host>:<vllm serve port>/v1 \
     --restart always \
     ghcr.io/open-webui/open-webui:main
   ```
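
   Note that `localhost` inside the container refers to the container itself, not the machine running Docker. If vLLM runs on the same host as Docker, one common option (a sketch, assuming Docker Desktop or Docker Engine 20.10+ and vLLM on its default port 8000) is to map `host.docker.internal` to the host gateway:

   ```console
   docker run -d -p 3000:8080 \
     --name open-webui \
     -v open-webui:/app/backend/data \
     --add-host=host.docker.internal:host-gateway \
     -e OPENAI_API_BASE_URL=http://host.docker.internal:8000/v1 \
     --restart always \
     ghcr.io/open-webui/open-webui:main
   ```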

4. Open it in the browser: <http://open-webui-host:3000/> (replace `open-webui-host` with the address of the machine running the container).

At the top of the web page, you can see the model `qwen/Qwen1.5-0.5B-Chat`.
![Spans details](https://imgur.com/a/pm1VRqG)
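
If no model shows up, the container logs usually indicate whether Open WebUI could reach the configured `OPENAI_API_BASE_URL` (the command below assumes the container name `open-webui` used above):

```console
docker logs -f open-webui
```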