ChatQnA/README.md: 23 additions & 1 deletion
@@ -91,6 +91,14 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 docker compose up -d
 ```
+
+To enable OpenTelemetry Tracing, merge the compose.telemetry.yaml file with the default compose.yaml file.
+
+CPU example with the OpenTelemetry feature enabled:
+
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
+docker compose -f compose.yaml -f compose.telemetry.yaml up -d
+```
+
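Docker Compose merges later `-f` files over earlier ones, so compose.telemetry.yaml only needs to declare the telemetry additions. Below is a hypothetical sketch of what such an override file can contain; it is an illustration of the merge mechanism, not the repository's actual compose.telemetry.yaml (service names, the Jaeger endpoint, and ports are assumptions):

```yaml
# Hypothetical telemetry override (NOT the actual compose.telemetry.yaml).
# When passed as the second -f file, these settings are layered on top of
# the services already defined in compose.yaml.
services:
  tgi-service:                  # assumed service name from compose.yaml
    environment:
      # Standard OpenTelemetry env var; 4318 is the default OTLP/HTTP port.
      OTEL_EXPORTER_OTLP_ENDPOINT: http://jaeger:4318
  jaeger:                       # added only by the override file
    image: jaegertracing/all-in-one:latest
    ports:
      - "16686:16686"           # Jaeger UI
```

Running `docker compose -f compose.yaml -f compose.telemetry.yaml config` prints the merged configuration, which is a convenient way to verify what the override actually changes before starting the stack.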
 
 It will automatically download the docker images from Docker Hub:
 
 ```bash
@@ -232,6 +240,13 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
 docker compose up -d
 ```
+
+To enable OpenTelemetry Tracing, merge the compose.telemetry.yaml file with the default compose.yaml file:
+
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
+docker compose -f compose.yaml -f compose.telemetry.yaml up -d
+```
+
 Refer to the [Gaudi Guide](./docker_compose/intel/hpu/gaudi/README.md) to build docker images from source.
### Deploy ChatQnA on Xeon
@@ -243,6 +258,13 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 docker compose up -d
 ```
+
+To enable OpenTelemetry Tracing, merge the compose.telemetry.yaml file with the default compose.yaml file:
+
+```bash
+cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
+docker compose -f compose.yaml -f compose.telemetry.yaml up -d
+```
+
 Refer to the [Xeon Guide](./docker_compose/intel/cpu/xeon/README.md) for more instructions on building docker images from source.
### Deploy ChatQnA on NVIDIA GPU
@@ -346,7 +368,7 @@ OPEA microservice deployment can easily be monitored through Grafana dashboards
 
 ## Tracing Services with OpenTelemetry Tracing and Jaeger
 
-> NOTE: limited support. Only LLM inference serving with TGI on Gaudi is enabled for this feature.
+> NOTE: This feature is disabled by default. See the Deploy ChatQnA sections above for how to enable it with the compose.telemetry.yaml file.
 
 OPEA microservices and TGI/TEI serving can easily be traced through Jaeger dashboards in conjunction with the OpenTelemetry Tracing feature. Follow the [README](https://github.com/opea-project/GenAIComps/tree/main/comps/cores/telemetry#tracing) to trace additional functions if needed.