Can we use LLMs to query documents that cover different topics but are stored in the same vector database as separate collections?
Steps:
- Create a single ChromaDB vector store instance with multiple collections, one per distinct topic (covered in load-data-into-vectordb.ipynb; see the first sketch after this list).
- Using LangChain, create a LOTR (Lord of the Retrievers, i.e. MergerRetriever) that points to a retriever for each collection (see the second sketch below).
- Use the available contextual compression retrievers to keep only the documents relevant to the query from the LOTR output (see the third sketch below).
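A minimal sketch of the first step, assuming a persistent ChromaDB client wrapped by LangChain's Chroma vector store. The collection names, sample texts, storage path, and embedding model are placeholders, and import paths may differ across LangChain versions:

```python
import chromadb
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import Chroma

# One persistent ChromaDB instance that will hold several topic-specific collections.
client = chromadb.PersistentClient(path="./chroma_db")  # hypothetical path
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

# Placeholder documents per topic; in the notebook these would come from loaders/splitters.
topic_docs = {
    "finance": ["Quarterly revenue grew 8%.", "The bond market rallied."],
    "health": ["Regular exercise improves sleep.", "Vitamin D supports immunity."],
}

stores = {}
for topic, texts in topic_docs.items():
    # Each topic gets its own collection inside the same ChromaDB instance.
    stores[topic] = Chroma.from_texts(
        texts,
        embedding=embeddings,
        collection_name=topic,
        client=client,
    )
```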
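For the second step, a sketch of building the LOTR (MergerRetriever) over one retriever per collection. It reuses the `stores` dictionary from the previous sketch; the `k` value and query are arbitrary examples:

```python
from langchain.retrievers import MergerRetriever

# One retriever per topic collection.
retrievers = [store.as_retriever(search_kwargs={"k": 3}) for store in stores.values()]

# LOTR (Lord of the Retrievers) merges the results from all collections.
lotr = MergerRetriever(retrievers=retrievers)

docs = lotr.get_relevant_documents("How did the bond market perform?")
```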
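For the third step, a sketch of filtering the merged results with LangChain's contextual compression retriever, here using an embeddings-based redundancy filter followed by a relevance filter. The similarity threshold is an assumed example value, and import paths may vary with the LangChain version:

```python
from langchain.retrievers import ContextualCompressionRetriever
from langchain.retrievers.document_compressors import (
    DocumentCompressorPipeline,
    EmbeddingsFilter,
)
from langchain_community.document_transformers import EmbeddingsRedundantFilter

# Drop near-duplicate chunks coming back from different collections,
# then keep only chunks sufficiently similar to the query.
redundant_filter = EmbeddingsRedundantFilter(embeddings=embeddings)
relevant_filter = EmbeddingsFilter(embeddings=embeddings, similarity_threshold=0.7)
pipeline = DocumentCompressorPipeline(transformers=[redundant_filter, relevant_filter])

compression_retriever = ContextualCompressionRetriever(
    base_compressor=pipeline,
    base_retriever=lotr,
)

relevant_docs = compression_retriever.get_relevant_documents(
    "How did the bond market perform?"
)
```

The resulting `compression_retriever` can then be plugged into a question-answering chain so the LLM only sees the deduplicated, relevant chunks drawn from across the collections.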