
Commit 358c6bb

Update README.md
1 parent dfe987d commit 358c6bb

File tree

1 file changed (+19, -1 lines)


ChatQnA/README.md

Lines changed: 19 additions & 1 deletion
@@ -90,6 +90,13 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 # cd GenAIExamples/ChatQnA/docker_compose/nvidia/gpu/
 docker compose up -d
 ```
+To enable Open Telemetry Tracing, the compose_telemetry.yaml file needs to be merged with the default compose.yaml file.
+CPU example with the Open Telemetry feature:
+
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+```
 
 It will automatically download the docker image from `docker hub`:
 
@@ -232,6 +239,12 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
 docker compose up -d
 ```
 
+To enable Open Telemetry Tracing, the compose_telemetry.yaml file needs to be merged with the default compose.yaml file.
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+```
+
 Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build docker images from source.
 
 ### Deploy ChatQnA on Xeon
@@ -242,6 +255,11 @@ Find the corresponding [compose.yaml](./docker_compose/intel/cpu/xeon/compose.ya
 cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 docker compose up -d
 ```
+To enable Open Telemetry Tracing, the compose_telemetry.yaml file needs to be merged with the default compose.yaml file.
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+```
 
 Refer to the [Xeon Guide](./docker_compose/intel/cpu/xeon/README.md) for more instructions on building docker images from source.

@@ -346,7 +364,7 @@ OPEA microservice deployment can easily be monitored through Grafana dashboards
 
 ## Tracing Services with OpenTelemetry Tracing and Jaeger
 
-> NOTE: limited support. Only LLM inference serving with TGI on Gaudi is enabled for this feature.
+> NOTE: This feature is disabled by default. Please check the Deploy ChatQnA sections for how to enable it with the compose_telemetry.yaml file.
 
 OPEA microservice and TGI/TEI serving can easily be traced through Jaeger dashboards in conjunction with the OpenTelemetry Tracing feature. Follow the [README](https://github.com/opea-project/GenAIComps/tree/main/comps/cores/telemetry#tracing) to trace additional functions if needed.
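Editor's note on what the `-f compose.yaml -f compose_telemetry.yaml` invocation does: Docker Compose deep-merges later files over earlier ones, so compose_telemetry.yaml only has to declare the telemetry-related additions for each service. The sketch below illustrates the mapping-merge idea in Python; the service name, image, and environment variables are hypothetical placeholders, and Compose's real merge rules (especially for lists such as `ports`) are more involved than this simplified recursion.

```python
def merge(base, override):
    """Simplified sketch of Compose-style file merging: nested mappings
    merge recursively, and any other value from the later file wins."""
    result = dict(base)
    for key, value in override.items():
        if isinstance(value, dict) and isinstance(result.get(key), dict):
            result[key] = merge(result[key], value)
        else:
            result[key] = value
    return result

# Hypothetical fragments of compose.yaml and compose_telemetry.yaml,
# parsed into dicts (names and values are illustrative only).
base = {
    "services": {
        "tgi-service": {
            "image": "ghcr.io/huggingface/text-generation-inference",
            "environment": {"HF_TOKEN": "..."},
        }
    }
}
telemetry = {
    "services": {
        "tgi-service": {
            "environment": {"OTEL_EXPORTER_OTLP_ENDPOINT": "http://jaeger:4318"},
        }
    }
}

merged = merge(base, telemetry)
# The merged service keeps its image and HF_TOKEN from compose.yaml and
# gains the OTLP endpoint from compose_telemetry.yaml.
```

The same idea explains why the telemetry file can stay tiny: it never repeats the base service definitions, only the keys it adds or overrides.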
