Automatic DB + RAG + LLM + Chat UI using docker compose and task

rodrigobdz/sponge

Sponge

A simple way to start up a locally hosted environment that provides a web interface to interact with an LLM using RAG.
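A stack like the one described (DB + LLM + chat UI) could be wired together roughly as follows. This is a minimal sketch only: the service names, image names, tags, and ports below are illustrative assumptions, not this repository's actual compose file.

```yaml
# Hypothetical docker-compose sketch of the Sponge stack.
# Images, tags, and ports are assumptions for illustration.
services:
  db:
    image: timescale/timescaledb-ha:pg16   # Postgres; pgai extension assumed available
    environment:
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
  chat-ui:
    image: ghcr.io/huggingface/chat-ui     # assumed image name
    ports:
      - "3000:3000"
    depends_on:
      - db
      - ollama
```

With `task` driving `docker compose up`, the whole environment would come up with one command.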

TODO:

  • implement the RAG interceptor between chat-ui and mistral/ollama
  • implement a way to load data into the pgai db from:
    • websites
    • local PDFs
    • local text
  • have a browser extension that will load things into the db, linked back to the source
  • be able to print to it?
  • emails?
  • etc
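The first TODO item, a RAG interceptor sitting between chat-ui and mistral/ollama, might start from a retrieve-then-augment step like the sketch below. Everything here (`retrieve`, `augment_prompt`, the in-memory store) is a hypothetical illustration, not code from this repo; the real interceptor would query the pgai database instead of a Python list.

```python
# Hypothetical sketch of the planned RAG interceptor's core step:
# retrieve relevant chunks, then prepend them to the user's prompt
# before forwarding it to the LLM. All names are assumptions.

def retrieve(question: str, store: list[str], k: int = 3) -> list[str]:
    """Toy retrieval stand-in: rank chunks by word overlap with the
    question. The real version would be a vector-similarity query
    against the pgai DB."""
    words = set(question.lower().split())
    scored = sorted(store, key=lambda c: -len(words & set(c.lower().split())))
    return scored[:k]

def augment_prompt(question: str, chunks: list[str]) -> str:
    """Prepend retrieved context so the LLM answers from stored data."""
    context = "\n".join(f"- {c}" for c in chunks)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

if __name__ == "__main__":
    store = [
        "Sponge wires a DB, RAG, an LLM, and a chat UI together.",
        "Unrelated note about something else.",
    ]
    chunks = retrieve("What does Sponge wire together?", store, k=1)
    print(augment_prompt("What does Sponge wire together?", chunks))
```

The interceptor itself would then be a thin HTTP proxy: receive the chat request from chat-ui, run this augmentation, and forward the rewritten prompt to ollama.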
