[Doc] Update ChatQnA flow chart (#542)
* Update flow chart

Signed-off-by: Wang, Xigui <xigui.wang@intel.com>

* Updated Flowchart

Signed-off-by: srinarayan-srikanthan <srinarayan.srikanthan@intel.com>

---------

Signed-off-by: Wang, Xigui <xigui.wang@intel.com>
Signed-off-by: srinarayan-srikanthan <srinarayan.srikanthan@intel.com>
Co-authored-by: Louie Tsai <louie.tsai@intel.com>
(cherry picked from commit dad8eb4)
xiguiw authored and chensuyue committed Aug 27, 2024
1 parent dab0177 commit 4d59721
Showing 2 changed files with 86 additions and 1 deletion.
85 changes: 84 additions & 1 deletion ChatQnA/README.md
@@ -10,7 +10,90 @@ The ChatQnA architecture is shown below:

ChatQnA is implemented on top of [GenAIComps](https://github.com/opea-project/GenAIComps); the ChatQnA flow chart is shown below:

![Flow Chart](./assets/img/chatqna_flow_chart.png)
```mermaid
---
config:
flowchart:
nodeSpacing: 100
rankSpacing: 100
curve: linear
theme: base
themeVariables:
fontSize: 42px
---
flowchart LR
%% Colors %%
classDef blue fill:#ADD8E6,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
classDef orange fill:#FBAA60,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
classDef orchid fill:#C26DBC,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
classDef invisible fill:transparent,stroke:transparent;
style ChatQnA-MegaService stroke:#000000
%% Subgraphs %%
subgraph ChatQnA-MegaService["ChatQnA-MegaService"]
direction LR
EM([Embedding <br>]):::blue
RET([Retrieval <br>]):::blue
RER([Rerank <br>]):::blue
LLM([LLM <br>]):::blue
end
subgraph User Interface
direction TB
a([User Input Query]):::orchid
Ingest([Ingest data]):::orchid
UI([UI server<br>]):::orchid
end
subgraph ChatQnA Gateway
direction LR
invisible1[ ]:::invisible
GW([ChatQnA Gateway<br>]):::orange
end
subgraph .
X([OPEA Microservice]):::blue
Y{{Open Source Service}}
Z([OPEA Gateway]):::orange
Z1([UI]):::orchid
end
TEI_RER{{Reranking service<br>'TEI'<br>}}
TEI_EM{{Embedding service <br>'TEI LangChain'<br>}}
VDB{{Vector DB<br>'Redis'<br>}}
R_RET{{Retriever service <br>'LangChain Redis'<br>}}
DP([Data Preparation<br>'LangChain Redis'<br>]):::blue
LLM_gen{{LLM Service <br>'TGI'<br>}}
%% Data Preparation flow
%% Ingest data flow
direction LR
Ingest[Ingest data] -->|a| UI
UI -->|b| DP
DP <-.->|c| TEI_EM
%% Questions interaction
direction LR
a[User Input Query] -->|1| UI
UI -->|2| GW
GW <==>|3| ChatQnA-MegaService
EM ==>|4| RET
RET ==>|5| RER
RER ==>|6| LLM
%% Embedding service flow
direction TB
EM <-.->|3'| TEI_EM
RET <-.->|4'| R_RET
RER <-.->|5'| TEI_RER
LLM <-.->|6'| LLM_gen
direction TB
%% Vector DB interaction
R_RET <-.->|d|VDB
DP <-.->|d|VDB
```
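
To make the ingestion path (steps a–d in the flow chart) concrete, here is a minimal sketch of feeding a document straight to the Data Preparation microservice with `curl`. The port `6007` and the `/v1/dataprep` route are assumptions based on a default ChatQnA deployment; check your compose configuration for the actual values.

```bash
# Hypothetical example: upload a local file to the dataprep service, which embeds it
# (via TEI) and stores the vectors in Redis, as in steps c and d of the flow chart.
# Port 6007 and the /v1/dataprep route are assumptions -- verify them in your deployment.
curl -X POST "http://${host_ip}:6007/v1/dataprep" \
    -H "Content-Type: multipart/form-data" \
    -F "files=@./your_document.pdf"
```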

This ChatQnA use case performs RAG using LangChain, Redis VectorDB, and Text Generation Inference on Intel Gaudi2 or Intel Xeon Scalable Processors. The Intel Gaudi2 accelerator supports both training and inference for deep learning models, in particular LLMs. Visit [Habana AI products](https://habana.ai/products) for more details.
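
As an illustration of the query path (steps 1–6 in the flow chart), a deployed ChatQnA megaservice can be exercised with a single request to the gateway. This is a sketch only; the port `8888`, the `/v1/chatqna` route, and the request body are assumptions that may differ in your deployment.

```bash
# Hypothetical quick check of the end-to-end pipeline (embedding -> retrieval -> rerank -> LLM).
# Port 8888 and the /v1/chatqna route are assumptions -- confirm them against your compose file.
curl -X POST "http://${host_ip}:8888/v1/chatqna" \
    -H "Content-Type: application/json" \
    -d '{"messages": "What is the OPEA project?"}'
```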

2 changes: 2 additions & 0 deletions DocSum/docker/xeon/README.md
@@ -60,6 +60,8 @@ Build the frontend Docker image via the command below:
cd GenAIExamples/DocSum/docker/ui/
export BACKEND_SERVICE_ENDPOINT="http://${host_ip}:8888/v1/docsum"
docker build -t opea/docsum-react-ui:latest --build-arg BACKEND_SERVICE_ENDPOINT=$BACKEND_SERVICE_ENDPOINT -f ./docker/Dockerfile.react .

docker build -t opea/docsum-react-ui:latest --build-arg BACKEND_SERVICE_ENDPOINT=$BACKEND_SERVICE_ENDPOINT --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f ./docker/Dockerfile.react .
```
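
As a quick smoke test after the build, the image can be started directly. This is a hedged sketch: the container port `80` and host port `5174` are assumptions, so check the `EXPOSE` line in `docker/Dockerfile.react` and your compose file for the real values.

```bash
# Run the freshly built UI image; container port 80 is an assumption --
# adjust the mapping if the Dockerfile exposes a different port.
docker run -d --name docsum-react-ui -p 5174:80 opea/docsum-react-ui:latest
# If the assumptions hold, the UI is then reachable at http://${host_ip}:5174
```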

Then run the command `docker images`; you should see the following Docker images:
