Commit 85f9b43

Update README.md
1 parent 358c6bb commit 85f9b43

File tree

  • ChatQnA/docker_compose/intel/cpu/xeon

1 file changed: +12 −0 lines changed
ChatQnA/docker_compose/intel/cpu/xeon/README.md

Lines changed: 12 additions & 0 deletions
@@ -44,6 +44,14 @@ To set up environment variables for deploying ChatQnA services, follow these ste
 docker compose up -d
 ```
 
+To enable OpenTelemetry Tracing, the compose_telemetry.yaml file needs to be merged with the default compose.yaml file.
+CPU example with the OpenTelemetry feature:
+
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+```
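When docker compose is given several `-f` files, it merges them left to right, so a later file can add new services or extend fields of services already defined in an earlier one. Purely as an illustration of that pattern, a telemetry override of this kind might look like the sketch below; the service name, the `jaeger` collector, and the environment variable are assumptions for illustration, not the actual contents of compose_telemetry.yaml in the repository.

```yaml
# Hypothetical sketch of a telemetry override file.
# The real compose_telemetry.yaml in GenAIExamples may differ.
services:
  jaeger:
    image: jaegertracing/all-in-one:latest   # collector + UI in one container
    ports:
      - "16686:16686"   # Jaeger web UI
      - "4318:4318"     # OTLP/HTTP ingest
  chatqna-xeon-backend-server:               # assumed service name
    environment:
      - TELEMETRY_ENDPOINT=http://jaeger:4318/v1/traces   # assumed variable name
```

Because an override of this shape only adds one service and extends another's environment, the base compose.yaml remains fully usable on its own.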
+
 It will automatically download the docker image on `docker hub`:
 
 ```bash
@@ -263,12 +271,16 @@ If using vLLM as the LLM serving backend:
 docker compose -f compose.yaml up -d
 # Start ChatQnA without Rerank Pipeline
 docker compose -f compose_without_rerank.yaml up -d
+# Start ChatQnA with Rerank Pipeline and OpenTelemetry Tracing
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
 ```
 
 If using TGI as the LLM serving backend:
 
 ```bash
 docker compose -f compose_tgi.yaml up -d
+# Start ChatQnA with Rerank Pipeline and OpenTelemetry Tracing
+docker compose -f compose_tgi.yaml -f compose_tgi_telemetry.yaml up -d
 ```
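The vLLM and TGI variants follow the same two-file pattern, so a small wrapper can keep the invocations consistent. A minimal sketch, assuming a hypothetical `compose_args` helper (not part of the repository) that appends the telemetry override when `TELEMETRY=1`:

```bash
# Hypothetical helper: build the -f arguments for docker compose,
# adding the telemetry override file when TELEMETRY=1.
compose_args() {
  files="-f compose.yaml"
  if [ "${TELEMETRY:-0}" = "1" ]; then
    files="$files -f compose_telemetry.yaml"
  fi
  printf '%s\n' "$files"
}

# Usage (run from the xeon compose directory):
#   docker compose $(compose_args) up -d
TELEMETRY=1 compose_args   # prints: -f compose.yaml -f compose_telemetry.yaml
```

The TGI variant would swap in compose_tgi.yaml and compose_tgi_telemetry.yaml; the helper only assembles arguments, so it can be inspected without docker installed.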
 
 ### Validate Microservices
