
Commit 4aa35c2

Update README.md
1 parent 85f9b43 commit 4aa35c2

File tree

1 file changed: +10 -0 lines changed
  • ChatQnA/docker_compose/intel/hpu/gaudi


ChatQnA/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 10 additions & 0 deletions
@@ -45,6 +45,12 @@ To set up environment variables for deploying ChatQnA services, follow these steps:
 docker compose up -d
 ```
 
+To enable Open Telemetry Tracing, the compose_telemetry.yaml file needs to be merged with the default compose.yaml file.
+
+```bash
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+```
+
 It will automatically download the docker image on `docker hub`:
 
 ```bash
@@ -259,12 +265,16 @@ If use vLLM as the LLM serving backend.
 docker compose -f compose.yaml up -d
 # Start ChatQnA without Rerank Pipeline
 docker compose -f compose_without_rerank.yaml up -d
+# Start ChatQnA with Rerank Pipeline and Open Telemetry Tracing
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
 ```
 
 If use TGI as the LLM serving backend.
 
 ```bash
 docker compose -f compose_tgi.yaml up -d
+# Start ChatQnA with Open Telemetry Tracing
+docker compose -f compose_tgi.yaml -f compose_tgi_telemetry.yaml up -d
 ```
 
 If you want to enable guardrails microservice in the pipeline, please follow the below command instead:
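
For reference, Docker Compose merges files passed with multiple `-f` flags in order, with later files extending or overriding earlier ones; that is how compose_telemetry.yaml layers the tracing services on top of the default deployment. A minimal sanity check of the merged stack, assuming the working directory is ChatQnA/docker_compose/intel/hpu/gaudi as in the README above:

```bash
# Render the configuration that results from layering the telemetry overlay
# on top of the default compose file, without starting any containers.
docker compose -f compose.yaml -f compose_telemetry.yaml config

# List only the service names of the merged stack, to confirm the telemetry
# services are present before running `up -d`.
docker compose -f compose.yaml -f compose_telemetry.yaml config --services
```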
