ChatQnA/README.md — 19 additions & 1 deletion
@@ -90,6 +90,13 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 # cd GenAIExamples/ChatQnA/docker_compose/nvidia/gpu/
 docker compose up -d
 ```
+To enable Open Telemetry Tracing, the compose_telemetry.yaml file needs to be merged with the default compose.yaml file.
+CPU example with the Open Telemetry feature enabled:
+
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+```
 
 It will automatically download the docker image on `docker hub`:
@@ -232,6 +239,12 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
 docker compose up -d
 ```
 
+To enable Open Telemetry Tracing, the compose_telemetry.yaml file needs to be merged with the default compose.yaml file.
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+```
+
 Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build docker images from source.
 
 ### Deploy ChatQnA on Xeon
@@ -242,6 +255,11 @@ Find the corresponding [compose.yaml](./docker_compose/intel/cpu/xeon/compose.ya
 cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 docker compose up -d
 ```
+To enable Open Telemetry Tracing, the compose_telemetry.yaml file needs to be merged with the default compose.yaml file.
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
+docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+```
 
 Refer to the [Xeon Guide](./docker_compose/intel/cpu/xeon/README.md) for more instructions on building docker images from source.
 
@@ -346,7 +364,7 @@ OPEA microservice deployment can easily be monitored through Grafana dashboards
 
 ## Tracing Services with OpenTelemetry Tracing and Jaeger
 
-> NOTE: limited support. Only LLM inference serving with TGI on Gaudi is enabled for this feature.
+> NOTE: This feature is disabled by default. Please check the Deploy ChatQnA sections above for how to enable it with the compose_telemetry.yaml file.
 
 OPEA microservices and TGI/TEI serving can easily be traced through Jaeger dashboards in conjunction with the OpenTelemetry Tracing feature. Follow the [README](https://github.com/opea-project/GenAIComps/tree/main/comps/cores/telemetry#tracing) to trace additional functions if needed.
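The `-f compose.yaml -f compose_telemetry.yaml` commands in this diff rely on Docker Compose's multi-file merge: later files add new services and merge settings into services already defined in earlier files. The actual compose_telemetry.yaml lives in the repository; the sketch below is purely illustrative (the `jaeger` service layout and the `tgi-service` name and its environment variable are assumptions, not the real file contents), showing the general shape such an overlay takes:

```yaml
# Hypothetical overlay sketch -- NOT the actual compose_telemetry.yaml.
# When merged over compose.yaml, new services (jaeger) are added, and
# settings for already-defined services (tgi-service, assumed name) are
# merged into the base definition.
services:
  jaeger:
    image: jaegertracing/all-in-one:latest
    ports:
      - "16686:16686"   # Jaeger UI
      - "4318:4318"     # OTLP/HTTP trace ingest
  tgi-service:          # assumed existing service in compose.yaml
    environment:
      OTLP_ENDPOINT: "http://jaeger:4318"   # illustrative telemetry setting
```

To preview what the merged configuration would look like without starting anything, `docker compose -f compose.yaml -f compose_telemetry.yaml config` prints the resolved result.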