A simple way to spin up a locally hosted environment that provides a web interface for interacting with an LLM using retrieval-augmented generation (RAG).
- implement the RAG interceptor between chat-ui and mistral/ollama
- implement a way to load data into the pgai DB from:
  - websites
  - local PDFs
  - local text files
- have a browser extension that loads pages into the DB, linked back to their source
- be able to print to it?
- emails?
- etc.
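The interceptor item above could look something like this: a small piece of logic that takes the user's question from chat-ui, pulls matching chunks out of the pgai DB, and builds the augmented prompt that actually gets forwarded to mistral/ollama. This is only a sketch of the prompt-building step; the function name and prompt wording are illustrative, and the retrieval query is shown as a comment rather than a live connection.

```python
# Sketch of the core of the RAG interceptor (names are illustrative).
# The real interceptor would sit as an HTTP proxy between chat-ui and
# Ollama; this shows only how retrieved chunks get folded into the prompt.

def augment_prompt(question: str, chunks: list[str]) -> str:
    """Build the prompt the interceptor forwards to mistral/ollama."""
    context = "\n\n".join(f"- {c}" for c in chunks)
    return (
        "Answer using only the context below. Cite the source when possible.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

# In the real interceptor, `chunks` would come from a pgvector similarity
# query against the pgai DB, something like:
#   SELECT text FROM chunks
#   ORDER BY embedding <=> %(query_embedding)s
#   LIMIT 5;

if __name__ == "__main__":
    print(augment_prompt("What is pgai?", ["pgai adds AI helpers to Postgres."]))
```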
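For the loading items (websites, local PDFs, local text), one shared step is splitting each source into overlapping chunks tagged with where they came from, so that answers can link back to the source. A minimal sketch of that step, assuming illustrative chunk-size and overlap defaults (the record shape and field names are not fixed anywhere in this project yet):

```python
# Hypothetical chunking step shared by all loaders (web, PDF, text).
# Each chunk keeps its source and byte offset so the RAG answer can
# point back at where the text came from. Sizes are illustrative.

def chunk_document(text: str, source: str, size: int = 500, overlap: int = 100) -> list[dict]:
    """Split `text` into overlapping chunks tagged with their source."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than size")
    step = size - overlap
    chunks = []
    for start in range(0, max(len(text), 1), step):
        piece = text[start:start + size]
        if piece:  # skip the empty tail (and empty input)
            chunks.append({"source": source, "start": start, "text": piece})
    return chunks
```

Each record would then be embedded and inserted into the pgai DB; keeping `source` and `start` per chunk is what makes the "linked back to the source" requirement cheap later.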