Commit bd11aab

reidliu41 and adobrzyn authored and committed
fix: hyperlink (vllm-project#16778)

Signed-off-by: reidliu41 <reid201711@gmail.com>
Co-authored-by: reidliu41 <reid201711@gmail.com>
Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
1 parent 780ed9e commit bd11aab

File tree

1 file changed: +2 −2 lines changed


docs/source/deployment/frameworks/open-webui.md

Lines changed: 2 additions & 2 deletions
@@ -2,15 +2,15 @@
 # Open WebUI

-1. Install the (Docker)[https://docs.docker.com/engine/install/]
+1. Install the [Docker](https://docs.docker.com/engine/install/)

 2. Start the vLLM server with the supported chat completion model, e.g.

 ```console
 vllm serve qwen/Qwen1.5-0.5B-Chat
 ```

-1. Start the (Open WebUI)[https://github.com/open-webui/open-webui] docker container (replace the vllm serve host and vllm serve port):
+1. Start the [Open WebUI](https://github.com/open-webui/open-webui) docker container (replace the vllm serve host and vllm serve port):

 ```console
 docker run -d -p 3000:8080 \
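The bug this commit fixes is reversed Markdown link syntax: `(text)[url]` instead of the correct `[text](url)`. A small lint check for this mistake could look like the sketch below; `find_reversed_links` and its regex are hypothetical helpers for illustration, not part of the vLLM repository or its tooling.

```python
import re

# Markdown links are written [text](url); the docs here had them
# reversed as (text)[url]. This pattern flags the reversed form.
REVERSED_LINK = re.compile(r"\(([^()]+)\)\[(https?://[^\]\s]+)\]")

def find_reversed_links(line: str):
    """Return (text, url) pairs written in the reversed (text)[url] form."""
    return REVERSED_LINK.findall(line)

broken = "1. Install the (Docker)[https://docs.docker.com/engine/install/]"
fixed = "1. Install the [Docker](https://docs.docker.com/engine/install/)"

print(find_reversed_links(broken))  # the reversed form is detected
print(find_reversed_links(fixed))   # the corrected form yields no matches
```

Running such a check over `docs/` before committing would catch both of the links corrected in this diff.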
