1 parent 780ed9e · commit bd11aab
docs/source/deployment/frameworks/open-webui.md
@@ -2,15 +2,15 @@

# Open WebUI

-1. Install the (Docker)[https://docs.docker.com/engine/install/]
+1. Install the [Docker](https://docs.docker.com/engine/install/)

2. Start the vLLM server with the supported chat completion model, e.g.

```console
vllm serve qwen/Qwen1.5-0.5B-Chat
```

-1. Start the (Open WebUI)[https://github.com/open-webui/open-webui] docker container (replace the vllm serve host and vllm serve port):
+1. Start the [Open WebUI](https://github.com/open-webui/open-webui) docker container (replace the vllm serve host and vllm serve port):


docker run -d -p 3000:8080 \
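
The hunk ends mid-command, so the remaining `docker run` arguments are not shown in this diff. As a rough sketch (not part of this commit), a complete invocation that points Open WebUI at the vLLM server's OpenAI-compatible endpoint could look like the following; `<vllm-serve-host>` and `<vllm-serve-port>` are placeholders for the host and port used with `vllm serve` (vLLM listens on port 8000 by default), and the `OPENAI_API_BASE_URL` variable and `ghcr.io/open-webui/open-webui:main` image come from Open WebUI's own documentation.

```console
# Sketch only: replace <vllm-serve-host> and <vllm-serve-port> with the values
# used when starting `vllm serve`; the /v1 suffix targets the OpenAI-compatible API.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://<vllm-serve-host>:<vllm-serve-port>/v1 \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Once the container is up, the UI is reachable at http://localhost:3000, and the model started with `vllm serve` (here `qwen/Qwen1.5-0.5B-Chat`) should appear in Open WebUI's model selector.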