A lightweight, no-frills RESTful API designed for managing knowledge bases and files stored in vector databases—no GPU, internet, or cloud services required! LocalRecall provides a simple and generic abstraction layer to handle knowledge retrieval, ideal for AI agents and chatbots to manage both long-term and short-term memory seamlessly.
LocalRecall currently ships batteries included with a local vector store powered by Chromem, and additional vector stores such as Milvus and Qdrant are planned. It integrates easily with LocalAI, LocalAGI, and other agent frameworks, and offers an intuitive web UI for convenient file management, including support for raw text inputs.
🆕 LocalRecall is now part of a comprehensive suite of AI tools designed to work together, alongside LocalAI and LocalAGI.

Key features:
- ⚡ RESTful API: Simple and intuitive REST interface for knowledge management.
- 📡 Fully Local: Operates offline without external cloud dependencies.
- 📚 RAG Knowledgebase: Retrieval-Augmented Generation (RAG) compatible with multiple vector databases.
- 🗃️ Memory Management: Ideal for AI-driven applications requiring memory abstraction.
- 📂 File Support:
  - ✅ Markdown
  - ✅ Plain Text
  - ⏳ More formats coming soon!
To build and run LocalRecall you will need:

- Go 1.16 or higher
- Docker (optional, for containerized deployment)
Build and run from source:

```bash
git clone https://github.com/mudler/LocalRecall.git
cd LocalRecall
go build -o localrecall
./localrecall
```
The web UI will be available at http://localhost:8080.
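Once the server is running, a quick way to confirm the API is reachable is to list the (initially empty) collections. This assumes the default listening address:

```bash
# Sanity check: the collections endpoint should respond once the server is up.
curl http://localhost:8080/api/collections
```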
Build and run using Docker:

```bash
docker build -t localrecall .
docker run -ti -v $PWD/state:/state \
  -e COLLECTION_DB_PATH=/state/db \
  -e EMBEDDING_MODEL=granite-embedding-107m-multilingual \
  -e FILE_ASSETS=/state/assets \
  -e OPENAI_API_KEY=sk-1234567890 \
  -e OPENAI_BASE_URL=http://localai:8080 \
  -p 8080:8080 localrecall

# Or use the images already built by the CI:
docker run -ti -v $PWD/state:/state \
  -e COLLECTION_DB_PATH=/state/db \
  -e EMBEDDING_MODEL=granite-embedding-107m-multilingual \
  -e FILE_ASSETS=/state/assets \
  -e OPENAI_API_KEY=sk-1234567890 \
  -e OPENAI_BASE_URL=http://localai:8080 \
  -p 8080:8080 quay.io/mudler/localrecall
```
Or with Docker Compose:

```bash
docker compose up -d
```
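If you are writing your own compose file, a minimal setup mirroring the `docker run` example above could look like the following sketch; the service name, image, and values are illustrative, so adjust them to your environment:

```yaml
# Illustrative sketch only; adjust image, ports, and environment to your setup.
services:
  localrecall:
    image: quay.io/mudler/localrecall
    ports:
      - "8080:8080"
    environment:
      - COLLECTION_DB_PATH=/state/db
      - EMBEDDING_MODEL=granite-embedding-107m-multilingual
      - FILE_ASSETS=/state/assets
      - OPENAI_API_KEY=sk-1234567890
      - OPENAI_BASE_URL=http://localai:8080
    volumes:
      - ./state:/state
```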
LocalRecall uses environment variables to configure its behavior. These variables allow you to customize paths, models, and integration settings without modifying the code.
| Variable | Description |
|---|---|
| `COLLECTION_DB_PATH` | Path to the vector database directory where collections are stored. |
| `EMBEDDING_MODEL` | Name of the embedding model used for vectorization (e.g., `granite-embedding-107m-multilingual`). |
| `FILE_ASSETS` | Directory path to store and retrieve uploaded file assets. |
| `OPENAI_API_KEY` | API key for embedding services (such as LocalAI or OpenAI-compatible APIs). |
| `OPENAI_BASE_URL` | Base URL for the embedding model API (commonly `http://localai:8080`). |
| `LISTENING_ADDRESS` | Address the server listens on (default: `:8080`). Useful for deployments on custom ports or network interfaces. |
| `VECTOR_ENGINE` | Vector database engine to use (`chromem` by default; support for others like Milvus and Qdrant planned). |
| `MAX_CHUNKING_SIZE` | Maximum size (in characters) for breaking documents into chunks. Affects performance and accuracy. |
| `API_KEYS` | Comma-separated list of API keys for securing access to the REST API (optional). |
These variables can be passed directly when running the binary or inside your Docker container for easy configuration.
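For example, when running the binary directly, the variables can be exported in the shell first. The values below are illustrative and assume a LocalAI (or other OpenAI-compatible) embedding endpoint on localhost:8080, so LocalRecall itself is bound to port 8081 to avoid a clash:

```bash
# Illustrative configuration for running the binary directly.
# Assumes an embedding endpoint on localhost:8080; LocalRecall listens on :8081.
export COLLECTION_DB_PATH=$PWD/state/db
export FILE_ASSETS=$PWD/state/assets
export EMBEDDING_MODEL=granite-embedding-107m-multilingual
export OPENAI_BASE_URL=http://localhost:8080
export OPENAI_API_KEY=sk-1234567890
export LISTENING_ADDRESS=:8081
./localrecall
```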
Base URL: http://localhost:8080/api
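The endpoint examples below reference the base URL through a `$BASE_URL` shell variable, which can be set once:

```bash
# Convenience variable used by the curl examples that follow.
export BASE_URL=http://localhost:8080/api
```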
- Create Collection:

  ```bash
  curl -X POST $BASE_URL/collections \
    -H "Content-Type: application/json" \
    -d '{"name":"myCollection"}'
  ```
- Upload File:

  ```bash
  curl -X POST $BASE_URL/collections/myCollection/upload \
    -F "file=@/path/to/file.txt"
  ```
- List Collections:

  ```bash
  curl -X GET $BASE_URL/collections
  ```
- List Files in Collection:

  ```bash
  curl -X GET $BASE_URL/collections/myCollection/entries
  ```
- Search Collection:

  ```bash
  curl -X POST $BASE_URL/collections/myCollection/search \
    -H "Content-Type: application/json" \
    -d '{"query":"search term", "max_results":5}'
  ```
- Reset Collection:

  ```bash
  curl -X POST $BASE_URL/collections/myCollection/reset
  ```
- Delete Entry:

  ```bash
  curl -X DELETE $BASE_URL/collections/myCollection/entry/delete \
    -H "Content-Type: application/json" \
    -d '{"entry":"file.txt"}'
  ```
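Putting the pieces together, a minimal ingest-and-query session against a locally running instance might look like this; the collection name, file path, and query are illustrative:

```bash
#!/usr/bin/env bash
# End-to-end sketch: create a collection, add a document, then query it.
# Assumes LocalRecall is running on the default address.
BASE_URL=http://localhost:8080/api

# 1. Create a collection to hold the documents.
curl -X POST $BASE_URL/collections \
  -H "Content-Type: application/json" \
  -d '{"name":"notes"}'

# 2. Upload a Markdown file into the collection.
curl -X POST $BASE_URL/collections/notes/upload \
  -F "file=@./meeting-notes.md"

# 3. Search the collection for relevant chunks.
curl -X POST $BASE_URL/collections/notes/search \
  -H "Content-Type: application/json" \
  -d '{"query":"action items from the last meeting", "max_results":3}'
```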
Released under the MIT License.
We welcome contributions! Please feel free to:
- ✅ Open an issue for suggestions or bugs.
- ✅ Submit a pull request with enhancements.