The Long-Term Memory Layer for AI Coding Agents
Give your AI Assistant a brain that persists context, learns from past mistakes, and understands your project's goals.
Quick Start • Features • Installation • Configuration • License
MCP-KG-Memory is a production-grade Model Context Protocol (MCP) Server designed to solve the "context amnesia" problem in AI coding assistants (Cursor, Windsurf, VS Code, Antigravity).
Instead of starting from zero every session, this system maintains a persistent Knowledge Graph of your project. It acts as an active memory layer that tracks:
- **Goals & Status:** What are we building? What is done?
- **Constraints & Rules:** Architectural decisions, forbidden patterns.
- **Strategies & Outcomes:** What worked? What failed? (Automatic learning)
- **User Preferences:** How do you like your code written?
- **Code Relationships:** Semantic links between goals and specific files.
Real-time visualization of the Knowledge Graph memory structure.
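The memory categories above map naturally onto typed nodes and relationships in the graph. As a rough mental model (plain Python dataclasses, not the project's actual schema; all class and field names here are illustrative assumptions):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A Knowledge Graph node (hypothetical model, not the real schema)."""
    id: str
    kind: str            # e.g. "Goal", "Constraint", "Strategy", "Preference", "File"
    props: dict = field(default_factory=dict)

@dataclass
class Edge:
    """A directed, typed relationship between two nodes."""
    src: str
    dst: str
    rel: str             # e.g. "IMPLEMENTS", "RELATES_TO", "VIOLATES"

# Example: an active goal linked to the file that implements it
goal = Node("g1", "Goal", {"title": "Add login flow", "status": "active"})
f = Node("f1", "File", {"path": "src/auth/login.py"})
link = Edge("f1", "g1", "IMPLEMENTS")
```

In the real system these live in Neo4j rather than in-process objects, but the shape of the data is the same: typed nodes connected by typed edges.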
Every time you start a task, the agent consults the memory. It automatically retrieves:
- Active goals relevant to your current work.
- Past failed attempts (to avoid repeating them) and successful strategies.
- Your specific coding preferences (SOLID, Clean Arch, etc.).
Don't just grep strings. The system traverses the graph (k hops) to find connected context. "I'm working on Auth" → retrieval includes the User model, the JWT utility, and the relevant security constraints defined two weeks ago.
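The k-hop retrieval described above can be pictured as a breadth-first walk outward from the node you are working on. A minimal pure-Python sketch (the real server queries Neo4j; `k_hop_context` and the adjacency dict are illustrative, not part of the project's API):

```python
from collections import deque

def k_hop_context(adj: dict[str, list[str]], start: str, k: int) -> set[str]:
    """Collect every node reachable from `start` within k hops."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:
            continue  # do not expand past the hop limit
        for neighbor in adj.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return seen - {start}

# "I'm working on Auth" -> pull everything within 2 hops of the Auth goal
adj = {
    "Auth": ["UserModel", "JWTUtil"],
    "JWTUtil": ["SecurityConstraint"],
}
reachable = k_hop_context(adj, "Auth", 2)
# reachable == {"UserModel", "JWTUtil", "SecurityConstraint"}
```

This is why a constraint attached to the JWT utility still surfaces when you mention "Auth": it is two hops away, not a string match.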
The system isn't static. It learns:
- Implicit Learning: Infers strategies and patterns from your conversations.
- Outcome Tracking: Remembers whether a strategy succeeded or failed, guiding future decisions.
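Outcome tracking can be approximated as counting per-strategy successes and failures, then ranking strategies by success rate. A toy sketch (the class and method names are assumptions for illustration, not the project's implementation):

```python
from collections import defaultdict

class StrategyMemory:
    """Toy outcome tracker: prefers strategies with a better success rate."""

    def __init__(self):
        self.outcomes = defaultdict(lambda: {"success": 0, "failure": 0})

    def record(self, strategy: str, success: bool) -> None:
        self.outcomes[strategy]["success" if success else "failure"] += 1

    def best(self) -> str:
        def rate(s: str) -> float:
            o = self.outcomes[s]
            total = o["success"] + o["failure"]
            return o["success"] / total if total else 0.0
        return max(self.outcomes, key=rate)

mem = StrategyMemory()
mem.record("retry-with-backoff", True)
mem.record("retry-with-backoff", True)
mem.record("busy-wait", False)
# mem.best() -> "retry-with-backoff"
```

The real system attaches these outcomes to Strategy nodes in the graph, so a failed approach recorded last week is retrieved alongside the goal it was tried on.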
- Core: Python 3.11+
- Database: Neo4j (Graph Database)
- LLM: Google Gemini 2.5 (via Direct API or LiteLLM Gateway)
- Protocol: Model Context Protocol (MCP)
You can install kg-mcp globally using pipx (recommended) or in a local virtual environment.
- Python 3.11+
- Docker (for running Neo4j locally)
- Gemini API Key (Get Key from Google AI Studio)
```shell
# Install the package
pipx install kg-mcp

# Run the interactive Setup Wizard
kg-mcp-setup
```

The wizard will:
- Check for Docker and Neo4j.
- Ask for your Gemini API Key.
- Configure the LLM Mode (Direct vs LiteLLM).
- Generate a secure `.env` file.
**Don't have pipx?** Install it first:
macOS:

```shell
brew install pipx
pipx ensurepath
```

Windows:

```shell
winget install pipx
pipx ensurepath
```

Linux (Debian/Ubuntu):

```shell
sudo apt install pipx
pipx ensurepath
```

Restart your terminal after installing pipx.
If you prefer not to use pipx:
```shell
pip install kg-mcp
kg-mcp-setup
```

```shell
# Clone the repository
git clone https://github.com/Hexecu/mcp-neuralmemory.git
cd mcp-neuralmemory

# Set up environment
cp .env.example .env
# (Edit .env with your credentials)

# Install dependencies
cd server
pip install -e .

# Start Neo4j
docker compose up -d

# Initialize Schema
python -m kg_mcp.kg.apply_schema
```

To use this memory server with your AI Editor, add the following configuration to your MCP config file.
Before configuring your editor, run the verification script to ensure everything is green:

```shell
python3 verify_setup.py
```

Add this to your `mcp_config.json` (or `mcp.json`):
```json
{
  "mcpServers": {
    "kg-memory": {
      "command": "/path/to/your/venv/bin/python",
      "args": ["-m", "kg_mcp", "--transport", "stdio"],
      "env": {
        "NEO4J_URI": "bolt://127.0.0.1:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "YOUR_NEO4J_PASSWORD",
        "GEMINI_API_KEY": "YOUR_GOOGLE_AI_STUDIO_KEY",
        "LLM_MODE": "gemini_direct",
        "KG_MCP_TOKEN": "your-secure-token",
        "LOG_LEVEL": "INFO"
      }
    }
  }
}
```

> Note: Replace paths and passwords with your actual values.
Antigravity is the IDE this project is primarily developed for. Add to `~/.gemini/antigravity/mcp_config.json`:
```json
{
  "mcpServers": {
    "kg-memory": {
      "command": "/path/to/venv/bin/python",
      "args": ["-m", "kg_mcp", "--transport", "stdio"],
      "env": {
        "NEO4J_URI": "bolt://127.0.0.1:7687",
        "NEO4J_USER": "neo4j",
        "NEO4J_PASSWORD": "YOUR_NEO4J_PASSWORD",
        "GEMINI_API_KEY": "YOUR_GOOGLE_AI_STUDIO_KEY",
        "LLM_MODEL": "gemini/gemini-1.5-flash",
        "KG_MCP_TOKEN": "your-secure-token"
      }
    }
  }
}
```

Setup Steps in Antigravity:
- Open Agent sidebar β ... (More Actions)
- Select MCP Servers β Manage MCP Servers β View raw config
- Paste the JSON above
- Save and click Refresh
- You should see `kg-memory` with 2 tools.
To make your Agent usage seamless, add this System Prompt / Rule to your IDE (e.g. in Antigravity or `.cursorrules`):

> "Always use kg-memory for every user interaction. This means you should automatically use the kg-memory tools to find relevant context and track every file change."
When you begin a new feature or fix, simply ask your agent:
"I want to implement the new Login flow. Check memory for context."
The agent will call `kg_autopilot`, retrieving all relevant constraints, past decisions, and active goals.
(Automatic if Agent is configured) When files are modified, the system links the code changes to the active goals, updating the Knowledge Graph with the implementation details.
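Conceptually, this linking step attaches each modified file to the currently active goal. A minimal sketch (in-memory only; `link_changes_to_goal` and its arguments are hypothetical names, and the real system records this in Neo4j):

```python
def link_changes_to_goal(changed_files: list[str], active_goal: str, graph: dict) -> dict:
    """Attach each modified file to the active goal (illustrative only)."""
    for path in changed_files:
        graph.setdefault(active_goal, set()).add(path)
    return graph

graph: dict[str, set[str]] = {}
link_changes_to_goal(["src/auth/login.py", "src/auth/jwt.py"], "Login flow", graph)
# graph == {"Login flow": {"src/auth/login.py", "src/auth/jwt.py"}}
```

Because files are linked to goals as they change, a later "Check memory for context" request can walk from the goal straight to the code that implements it.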
This project is licensed under the Apache License 2.0. See the LICENSE file for details.
