
Advanced chatbot over LlamaIndex TS documentation 🔥

Demo video: llamats_docs_demo_compressed.mp4

Multi-document Agents

This is a LlamaIndex project bootstrapped with create-llama.

This multi-document agent is built over the LlamaIndex.TS documentation.

We use our multi-document agent architecture (sketched below):

  • An individual query engine per document
  • A top-level orchestrator agent across documents that picks the relevant subsets
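
A minimal sketch of this pattern with the LlamaIndex Python API (module paths are for llama-index >= 0.10; the file paths, tool names, and question below are illustrative - the real construction lives in backend/app/utils/index.py):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import QueryEngineTool, ToolMetadata

# Illustrative document groups; the real project builds one per docs page/section.
docs_by_name = {
    "getting_started": SimpleDirectoryReader(input_files=["data/getting_started.md"]).load_data(),
    "modules": SimpleDirectoryReader(input_files=["data/modules.md"]).load_data(),
}

# One query engine (exposed as a tool) per document.
tools = []
for name, docs in docs_by_name.items():
    index = VectorStoreIndex.from_documents(docs)
    tools.append(
        QueryEngineTool(
            query_engine=index.as_query_engine(similarity_top_k=3),
            metadata=ToolMetadata(
                name=f"{name}_query_engine",
                description=f"Answers questions about the '{name}' part of the docs.",
            ),
        )
    )

# Top-level orchestrator agent that picks the relevant document tools per question.
agent = ReActAgent.from_tools(tools, verbose=True)
print(agent.chat("How do I create a vector index in LlamaIndex.TS?"))
```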

The app also streams all intermediate results from the agent via a custom callback handler.

We use this custom callback handler to also send the intermediate nodes retrieved by the document-level query engines to the frontend.

This lets us show the relevant section of the documentation in the preview window.
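
For illustration, a minimal callback handler in the Python API (not this repo's exact handler) can catch retrieval events and push the retrieved nodes onto a queue that the chat endpoint later streams to the client:

```python
from queue import Queue
from typing import Any, Dict, Optional

from llama_index.core.callbacks import CBEventType, EventPayload
from llama_index.core.callbacks.base_handler import BaseCallbackHandler


class StreamingCallbackHandler(BaseCallbackHandler):
    """Illustrative only: forwards retrieved nodes to a queue for streaming."""

    def __init__(self, queue: Queue) -> None:
        super().__init__(event_starts_to_ignore=[], event_ends_to_ignore=[])
        self._queue = queue

    def on_event_start(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        parent_id: str = "",
        **kwargs: Any,
    ) -> str:
        return event_id

    def on_event_end(
        self,
        event_type: CBEventType,
        payload: Optional[Dict[str, Any]] = None,
        event_id: str = "",
        **kwargs: Any,
    ) -> None:
        # When a document-level query engine finishes retrieval, forward its nodes.
        if event_type == CBEventType.RETRIEVE and payload is not None:
            nodes = payload.get(EventPayload.NODES, [])
            self._queue.put(
                {"type": "nodes", "texts": [n.node.get_content() for n in nodes]}
            )

    def start_trace(self, trace_id: Optional[str] = None) -> None:
        pass

    def end_trace(self, trace_id=None, trace_map=None) -> None:
        pass
```

The handler is registered through a CallbackManager (e.g. Settings.callback_manager = CallbackManager([handler])) so retrievals inside the agent's per-document query engines pass through it.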

Main Files to Look At

This project extends beyond the simple create-llama example. To see the changes, look at the following files:

  • backend/app/utils/index.py - contains the core logic for constructing and retrieving the multi-document agent
  • backend/app/api/routers/chat.py - contains the chat endpoint implementation and the threading used to stream intermediate responses (sketched below)
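
The real endpoint lives in backend/app/api/routers/chat.py; a rough FastAPI sketch of the queue-plus-thread idea (the request shape, get_agent factory, and wire format here are hypothetical, and StreamingCallbackHandler refers to the sketch above) looks like this:

```python
import json
import threading
from queue import Queue

from fastapi import APIRouter
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

# Hypothetical import; cf. the agent construction in backend/app/utils/index.py.
from app.utils.index import get_agent

router = APIRouter()


class ChatRequest(BaseModel):
    message: str  # hypothetical request shape


@router.post("/api/chat")
def chat(request: ChatRequest) -> StreamingResponse:
    queue: Queue = Queue()
    # Hypothetical factory that builds the multi-doc agent with the streaming handler attached.
    agent = get_agent(callback_handler=StreamingCallbackHandler(queue))

    def run_agent() -> None:
        # The blocking agent call runs in a worker thread; the callback handler
        # pushes intermediate nodes onto the queue while it runs.
        response = agent.chat(request.message)
        queue.put({"type": "final", "text": str(response)})
        queue.put(None)  # sentinel: nothing left to stream

    threading.Thread(target=run_agent, daemon=True).start()

    def event_stream():
        while (item := queue.get()) is not None:
            yield json.dumps(item) + "\n"

    return StreamingResponse(event_stream(), media_type="application/x-ndjson")
```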

We also created some custom transformations that we use with our robust IngestionPipeline.

As we update the documentation in the data folder, this IngestionPipeline takes care of handling duplicates, applying our custom node transformation logic, and so on.
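
A minimal sketch of that wiring with the LlamaIndex Python API (the node parser and "data" path are stand-ins; the repo's own transformations are listed next):

```python
from llama_index.core import SimpleDirectoryReader
from llama_index.core.ingestion import DocstoreStrategy, IngestionPipeline
from llama_index.core.node_parser import MarkdownNodeParser
from llama_index.core.storage.docstore import SimpleDocumentStore

pipeline = IngestionPipeline(
    # The custom transformations listed below slot into this list.
    transformations=[MarkdownNodeParser()],
    docstore=SimpleDocumentStore(),
    docstore_strategy=DocstoreStrategy.UPSERTS,  # skip unchanged docs, re-ingest changed ones
)

documents = SimpleDirectoryReader("data").load_data()
nodes = pipeline.run(documents=documents)
```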

The custom transformations we've used:

  • Deduplicator - handles duplicate nodes.
  • HyperlinksRemover - cleans hyperlinks out of the markdown files.
  • Summarizer - creates a summary of each node and adds it as metadata.
  • URLExtractor - generates the URL of a particular node's section.
  • Upserter - updates the docstore with new and updated nodes and deletes stale ones.
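
In the Python API, such transformations are typically TransformComponent subclasses that rewrite the nodes flowing through the pipeline. A rough, illustrative take on something like HyperlinksRemover (not the repo's actual implementation) could look like this:

```python
import re
from typing import Any, List

from llama_index.core.schema import BaseNode, TransformComponent


class HyperlinksRemover(TransformComponent):
    """Illustrative only: replace markdown links '[text](url)' with plain 'text'."""

    def __call__(self, nodes: List[BaseNode], **kwargs: Any) -> List[BaseNode]:
        for node in nodes:
            cleaned = re.sub(r"\[([^\]]+)\]\([^)]*\)", r"\1", node.get_content())
            node.set_content(cleaned)
        return nodes
```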

Getting Started

First, start the backend as described in the backend README.

Second, run the development server of the frontend as described in the frontend README.

Open http://localhost:3000 with your browser to see the result.

Learn More

To learn more about LlamaIndex, check out the LlamaIndexTS GitHub repository - your feedback and contributions are welcome!
