Quick Start | Documentation | LangChain Support | Discord
www.getzep.com
- Understands chat messages, roles, and user metadata, not just text and embeddings.
- Zep Memory and VectorStore implementations are shipped with your favorite frameworks: LangChain, LangChain.js, LlamaIndex, and more.
- Populate your prompts with relevant documents and chat history.
- Rich metadata and JSONPath query filters offer a powerful hybrid search over texts.
- Automatically embed texts and messages using state-of-the-art open-source models, OpenAI, or bring your own vectors.
- Enrichment of chat histories with summaries, named entities, and token counts. Use these as search filters.
- Associate your own metadata with sessions, documents & chat histories.
- Zep’s local embedding models and async enrichment ensure a snappy user experience.
- Storing documents and history in Zep and not in memory enables stateless deployment.
- Python & TypeScript/JS SDKs for easy integration with your LLM app.
- TypeScript/JS SDK supports edge deployment.
- 🏎️ Quick Start Guide: Docker or cloud deployment, and coding, in < 5 minutes.
- 🦙 Building Apps with LlamaIndex
- 🦜⛓️ Building Apps with LangChain
- 🛠️ Getting Started with TypeScript/JS or Python
- 🔑 Key Concepts
Hybrid search over a document collection, combining a text query with a JSONPath metadata filter (TypeScript):

```typescript
const query = "The celestial motions are nothing but a continual";
const searchResults = await collection.search({ text: query }, 3);

// Search for documents using both text and metadata
const metadataQuery = {
  where: { jsonpath: '$[*] ? (@.bar == "qux")' },
};
const newSearchResults = await collection.search(
  {
    text: query,
    metadata: metadataQuery,
  },
  3
);
```
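To make the JSONPath filter above concrete: a filter such as `$[*] ? (@.bar == "qux")` selects elements whose `bar` field equals `"qux"`. This is an illustrative, plain-Python equivalent of that semantics, not Zep's implementation (the `docs_metadata` values are made up for the example):

```python
# Illustrative only: the effect of the JSONPath filter
# $[*] ? (@.bar == "qux") over a list of metadata objects.
docs_metadata = [
    {"bar": "qux", "source": "a.txt"},
    {"bar": "baz", "source": "b.txt"},
]

# Keep only the elements whose "bar" field equals "qux"
matches = [m for m in docs_metadata if m.get("bar") == "qux"]
print(matches)  # [{'bar': 'qux', 'source': 'a.txt'}]
```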
Search by embedding vector rather than by text query (Python):

```python
# `embedding` is a list of floats, e.g. produced by your own embedding model
results = collection.search(
    embedding=embedding, limit=5
)
```
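Vector search ranks documents by similarity between embedding vectors. As a minimal sketch, here is cosine similarity, one common similarity metric, shown for illustration only and not claimed to be Zep's exact ranking function:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Identical directions score 1.0; orthogonal vectors score 0.0
print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # 0.0
```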
Persist chat history to Zep (Python):

```python
session_id = "2a2a2a"
history = [
    {"role": "human", "content": "Who was Octavia Butler?"},
    {
        "role": "ai",
        "content": (
            "Octavia Estelle Butler (June 22, 1947 – February 24, 2006) was an American"
            " science fiction author."
        ),
    },
    {
        "role": "human",
        "content": "Which books of hers were made into movies?",
        "metadata": {"foo": "bar"},
    },
]

messages = [Message(role=m["role"], content=m["content"]) for m in history]
memory = Memory(messages=messages)
result = await client.aadd_memory(session_id, memory)
```
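Persisted history is typically injected back into prompts. A minimal, framework-agnostic sketch of formatting role-tagged messages into a prompt context block (illustration only, not part of the Zep API):

```python
history = [
    {"role": "human", "content": "Who was Octavia Butler?"},
    {"role": "ai", "content": "Octavia Butler was an American science fiction author."},
]

# Flatten role-tagged messages into a single context string for a prompt
context = "\n".join(f"{m['role'].upper()}: {m['content']}" for m in history)
print(context)
# HUMAN: Who was Octavia Butler?
# AI: Octavia Butler was an American science fiction author.
```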
Use Zep as memory for a LangChain.js ConversationChain (TypeScript):

```typescript
const memory = new ZepMemory({
  sessionId,
  baseURL: zepApiURL,
  apiKey: zepApiKey,
});

const chain = new ConversationChain({ llm: model, memory });
const response = await chain.call({
  input:
    "What is the book's relevance to the challenges facing contemporary society?",
});
```
Please see the Zep Quick Start Guide for important configuration information.
```shell
docker compose up
```
Looking for other deployment options?
Please see the Zep Development Guide for important beta information and usage instructions.
```shell
pip install zep-python
```

or

```shell
npm i @getzep/zep-js
```