Add local Rerank microservice for VideoRAGQnA #496

Merged · 19 commits · Aug 29, 2024
3 changes: 3 additions & 0 deletions comps/__init__.py
@@ -12,8 +12,11 @@
GeneratedDoc,
LLMParamsDoc,
SearchedDoc,
SearchedMultimodalDoc,
RerankedDoc,
TextDoc,
ImageDoc,
TextImageDoc,
RAGASParams,
RAGASScores,
GraphDoc,
20 changes: 19 additions & 1 deletion comps/cores/proto/docarray.py
@@ -1,7 +1,7 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

from typing import Dict, List, Optional, Union
from typing import Dict, List, Optional, Tuple, Union

import numpy as np
from docarray import BaseDoc, DocList
@@ -20,6 +20,14 @@ class TextDoc(BaseDoc, TopologyInfo):
text: str


class ImageDoc(BaseDoc):
image_path: str


class TextImageDoc(BaseDoc):
doc: Tuple[Union[TextDoc, ImageDoc]]


class Base64ByteStrDoc(BaseDoc):
byte_str: str

@@ -67,6 +75,16 @@ class Config:
json_encoders = {np.ndarray: lambda x: x.tolist()}


class SearchedMultimodalDoc(BaseDoc):
retrieved_docs: List[TextImageDoc]
initial_query: str
top_n: int = 1
metadata: Optional[List[Dict]] = None

class Config:
json_encoders = {np.ndarray: lambda x: x.tolist()}


class GeneratedDoc(BaseDoc):
text: str
prompt: str
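
The new document types above correspond to the JSON payload accepted by the reranking endpoint below. As a rough sketch only (assuming `comps` is importable on `PYTHONPATH`; values are illustrative), they could be constructed like this:

```python
from comps import SearchedMultimodalDoc, TextImageDoc, TextDoc

# A single retrieved frame, wrapped as a one-element tuple inside TextImageDoc.
doc = TextImageDoc(doc=(TextDoc(text="this is the retrieved text"),))

request = SearchedMultimodalDoc(
    retrieved_docs=[doc],
    initial_query="this is the query",
    top_n=1,
    metadata=[{"video": "top_video_name", "timestamp": "20"}],
)
print(request.initial_query)
```
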
62 changes: 62 additions & 0 deletions comps/reranks/video-rag-qna/README.md
@@ -0,0 +1,62 @@
# Rerank Microservice

This is a Docker-based microservice that reranks retrieval results for the VideoRAGQnA use case. Reranking is done with a local function rather than a reranking model.

For the `VideoRAGQnA` use case, frames are extracted from videos and stored in a vector database during the data preparation phase. To identify the most relevant video, the rerank function `get_top_doc` counts how often each video source appears among the retrieved data and sorts the video names in descending order of match count. The `top_n` videos are then sent to the downstream LVM; a sketch of this counting step follows.
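
As a rough illustration only (not the service code itself — the full implementation is in `local_reranking.py` below), the counting-and-sorting step could look like:

```python
from collections import Counter

def get_top_doc(top_n, videos):
    # Count how often each video source appears among the retrieved frames,
    # then keep the top_n most frequent video names.
    if videos is None:
        return None
    return [name for name, _ in Counter(videos).most_common(top_n)]

# Frames retrieved from three chunks, two of them from the same video:
print(get_top_doc(1, ["top_video_name", "second_video_name", "top_video_name"]))
# -> ['top_video_name']
```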

# 🚀1. Start Microservice with Docker

## 1.1 Build Images

```bash
cd GenAIComps
docker build --no-cache -t opea/reranking-videoragqna:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/reranks/video-rag-qna/docker/Dockerfile .
```

## 1.2 Start Rerank Service

```bash
docker compose -f comps/reranks/video-rag-qna/docker/docker_compose_reranking.yaml up -d
# wait until ready
until docker logs reranking-videoragqna-server 2>&1 | grep -q "Uvicorn running on"; do
sleep 2
done
```

Available configuration by environment variable:

- `CHUNK_DURATION`: target chunk duration in seconds; should be aligned with the VideoRAGQnA dataprep. Default: `10`.
- `FILE_SERVER_ENDPOINT`: base URL of the file server hosting the videos, used to build the returned `video_url`. Default: `http://0.0.0.0:6005`.

# ✅ 2. Test

```bash
export ip_address=$(hostname -I | awk '{print $1}')
curl -X 'POST' \
"http://${ip_address}:8000/v1/reranking" \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"retrieved_docs": [{"doc": [{"text": "this is the retrieved text"}]}],
"initial_query": "this is the query",
"top_n": 1,
"metadata": [
{"other_key": "value", "video":"top_video_name", "timestamp":"20"},
{"other_key": "value", "video":"second_video_name", "timestamp":"40"},
{"other_key": "value", "video":"top_video_name", "timestamp":"20"}
]
}'
```

The result should be:

```bash
{"id":"random number","video_url":"http://0.0.0.0:6005/top_video_name","chunk_start":20.0,"chunk_duration":10.0,"prompt":"this is the query","max_new_tokens":512}
```

# ♻️ 3. Clean

```bash
# remove the container
cid=$(docker ps -aq --filter "name=reranking-videoragqna-server")
if [[ ! -z "$cid" ]]; then docker stop $cid && docker rm $cid && sleep 1s; fi
```
24 changes: 24 additions & 0 deletions comps/reranks/video-rag-qna/docker/Dockerfile
@@ -0,0 +1,24 @@

# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

FROM python:3.11-slim

ENV LANG=C.UTF-8

RUN useradd -m -s /bin/bash user && \
mkdir -p /home/user && \
chown -R user /home/user/

USER user

COPY comps /home/user/comps

RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir -r /home/user/comps/reranks/video-rag-qna/requirements.txt

ENV PYTHONPATH=$PYTHONPATH:/home/user

WORKDIR /home/user/comps/reranks/video-rag-qna

ENTRYPOINT ["python", "local_reranking.py"]
21 changes: 21 additions & 0 deletions comps/reranks/video-rag-qna/docker/docker_compose_reranking.yaml
@@ -0,0 +1,21 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

services:
reranking:
image: opea/reranking-videoragqna:latest
container_name: reranking-videoragqna-server
ports:
- "8000:8000"
ipc: host
environment:
no_proxy: ${no_proxy}
http_proxy: ${http_proxy}
https_proxy: ${https_proxy}
CHUNK_DURATION: ${CHUNK_DURATION}
FILE_SERVER_ENDPOINT: ${FILE_SERVER_ENDPOINT}
restart: unless-stopped

networks:
default:
driver: bridge
89 changes: 89 additions & 0 deletions comps/reranks/video-rag-qna/local_reranking.py
@@ -0,0 +1,89 @@
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

import logging
import os
import time

from comps import (
LVMVideoDoc,
SearchedMultimodalDoc,
ServiceType,
opea_microservices,
register_microservice,
register_statistics,
statistics_dict,
)

chunk_duration = os.getenv("CHUNK_DURATION", "10") or "10"
chunk_duration = float(chunk_duration) if chunk_duration.isdigit() else 10.0

file_server_endpoint = os.getenv("FILE_SERVER_ENDPOINT") or "http://0.0.0.0:6005"

logging.basicConfig(
level=logging.INFO, format="%(levelname)s: [%(asctime)s] %(message)s", datefmt="%d/%m/%Y %I:%M:%S"
)


def get_top_doc(top_n, videos) -> list:
hit_score = {}
if videos is None:
return None
for video_name in videos:
try:
if video_name not in hit_score.keys():
hit_score[video_name] = 0
hit_score[video_name] += 1
except KeyError as r:
logging.info(f"no video name {r}")

x = dict(sorted(hit_score.items(), key=lambda item: -item[1])) # sorted dict of video name and score
top_n_names = list(x.keys())[:top_n]
logging.info(f"top docs = {x}")
logging.info(f"top n docs names = {top_n_names}")

return top_n_names


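# Return the timestamp of the first metadata entry whose video name matches the given video.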
def find_timestamp_from_video(metadata_list, video):
return next(
(metadata["timestamp"] for metadata in metadata_list if metadata["video"] == video),
None,
)


@register_microservice(
name="opea_service@reranking_visual_rag",
service_type=ServiceType.RERANK,
endpoint="/v1/reranking",
host="0.0.0.0",
port=8000,
input_datatype=SearchedMultimodalDoc,
output_datatype=LVMVideoDoc,
)
@register_statistics(names=["opea_service@reranking_visual_rag"])
def reranking(input: SearchedMultimodalDoc) -> LVMVideoDoc:
start = time.time()

# get top video name from metadata
video_names = [meta["video"] for meta in input.metadata]
top_video_names = get_top_doc(input.top_n, video_names)

# only use the first top video
timestamp = find_timestamp_from_video(input.metadata, top_video_names[0])
video_url = f"{file_server_endpoint.rstrip('/')}/{top_video_names[0]}"

result = LVMVideoDoc(
video_url=video_url,
prompt=input.initial_query,
chunk_start=timestamp,
chunk_duration=float(chunk_duration),
max_new_tokens=512,
)
statistics_dict["opea_service@reranking_visual_rag"].append_latency(time.time() - start, None)

return result


if __name__ == "__main__":
opea_microservices["opea_service@reranking_visual_rag"].start()
11 changes: 11 additions & 0 deletions comps/reranks/video-rag-qna/requirements.txt
@@ -0,0 +1,11 @@
datasets
docarray
fastapi
opentelemetry-api
opentelemetry-exporter-otlp
opentelemetry-sdk
Pillow
prometheus-fastapi-instrumentator
pydub
shortuuid
uvicorn
78 changes: 78 additions & 0 deletions tests/test_reranks_video-rag-qna.sh
@@ -0,0 +1,78 @@
#!/bin/bash
# Copyright (C) 2024 Intel Corporation
# SPDX-License-Identifier: Apache-2.0

set -xe

WORKPATH=$(dirname "$PWD")
ip_address=$(hostname -I | awk '{print $1}')

function build_docker_images() {
cd $WORKPATH
docker build --no-cache -t opea/reranking-videoragqna:latest --build-arg https_proxy=$https_proxy --build-arg http_proxy=$http_proxy -f comps/reranks/video-rag-qna/docker/Dockerfile .
}

function start_service() {
docker run -d --name "test-comps-reranking-videoragqna-server" \
-p 5037:8000 \
--ipc=host \
-e no_proxy=${no_proxy} \
-e http_proxy=${http_proxy} \
-e https_proxy=${https_proxy} \
-e CHUNK_DURATION=${CHUNK_DURATION} \
-e FILE_SERVER_ENDPOINT=${FILE_SERVER_ENDPOINT} \
opea/reranking-videoragqna:latest


until docker logs test-comps-reranking-videoragqna-server 2>&1 | grep -q "Uvicorn running on"; do
sleep 2
done
}

function validate_microservice() {
result=$(\
http_proxy="" \
curl -X 'POST' \
"http://${ip_address}:5037/v1/reranking" \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"retrieved_docs": [
{"doc": [{"text": "this is the retrieved text"}]}
],
"initial_query": "this is the query",
"top_n": 1,
"metadata": [
{"other_key": "value", "video":"top_video_name", "timestamp":"20"},
{"other_key": "value", "video":"second_video_name", "timestamp":"40"},
{"other_key": "value", "video":"top_video_name", "timestamp":"20"}
]
}')
if [[ $result == *"this is the query"* ]]; then
echo "Result correct."
else
echo "Result wrong."
exit 1
fi
}

function stop_docker() {
cid=$(docker ps -aq --filter "name=test-comps-reranking*")
if [[ ! -z "$cid" ]]; then docker stop $cid && docker rm $cid && sleep 1s; fi
}

function main() {

stop_docker

build_docker_images
start_service

validate_microservice

stop_docker
echo y | docker system prune

}

main