Feat/add tool - cognee #238

Open: wants to merge 9 commits into base `main`
136 changes: 136 additions & 0 deletions agentstack/_tools/cognee/__init__.py
@@ -0,0 +1,136 @@
"""
Implementation of the cognee tool for AgentStack.
These functions wrap cognee's asynchronous methods and expose them
as synchronous functions with typed parameters and docstrings for use by AI agents.
"""

import asyncio

import cognee
from cognee.api.v1.search import SearchType


def prune_data(metadata: bool = False) -> str:
    """
    Prune the cognee data. If metadata is True, also prune system metadata.

    :param metadata: Whether to prune system metadata as well.
    :return: A confirmation message.
    """

    async def _prune():
        await cognee.prune.prune_data()
        if metadata:
            await cognee.prune.prune_system(metadata=True)
        return "Data pruned successfully."

    return asyncio.run(_prune())


def add_data(data) -> str:
    """
    Add any type of data to cognee's knowledge system for future 'cognify' operations.

    :param data: The data to ingest (e.g. raw text, a file path, or a list of either).
    :return: A confirmation message.
    """

    async def _add():
        await cognee.add(data)
        return "Data added successfully."

    return asyncio.run(_add())


def cognify() -> str:
    """
    Run cognee's 'cognify' pipeline to build the knowledge graph,
    summaries, and other metadata from previously added text.

    :return: A confirmation message.
    """

    async def _cognify():
        await cognee.cognify()
        return "Cognify process complete."

    return asyncio.run(_cognify())


def search_insights(query_text: str) -> str:
    """
    Perform an INSIGHTS search on the knowledge graph for the given query text.

    :param query_text: The query to search for.
    :return: The search results as a (stringified) list of matches.
    """

    async def _search():
        results = await cognee.search(SearchType.INSIGHTS, query_text=query_text)
        return str(results)

    return asyncio.run(_search())


def search_summaries(query_text: str) -> str:
    """
    Perform a SUMMARIES search on the knowledge graph for the given query text.

    :param query_text: The query to search for.
    :return: The search results as a (stringified) list of matches.
    """

    async def _search():
        results = await cognee.search(SearchType.SUMMARIES, query_text=query_text)
        return str(results)

    return asyncio.run(_search())


def search_chunks(query_text: str) -> str:
    """
    Perform a CHUNKS search on the knowledge graph for the given query text.

    :param query_text: The query to search for.
    :return: The search results as a (stringified) list of matches.
    """

    async def _search():
        results = await cognee.search(SearchType.CHUNKS, query_text=query_text)
        return str(results)

    return asyncio.run(_search())


def search_completion(query_text: str) -> str:
    """
    Perform a COMPLETION search on the knowledge graph for the given query text.

    This function retrieves the document chunk most relevant to the user's query
    and prompts the LLM to provide an answer using this context.

    :param query_text: The query to search for.
    :return: The search results as a (stringified) list of matches.
    """

    async def _search():
        results = await cognee.search(SearchType.COMPLETION, query_text=query_text)
        return str(results)

    return asyncio.run(_search())


def search_graph_completion(query_text: str) -> str:
    """
    Perform a GRAPH_COMPLETION search on the knowledge graph for the given query text.

    This function identifies the most relevant knowledge graph entities, including
    document chunks, related to the user's query and prompts the LLM to generate a
    response with this enriched context.

    :param query_text: The query to search for.
    :return: The search results as a (stringified) list of matches.
    """

    async def _search():
        results = await cognee.search(SearchType.GRAPH_COMPLETION, query_text=query_text)
        return str(results)

    return asyncio.run(_search())
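Every tool in this module follows the same sync-over-async pattern: define a small async helper, then drive it to completion with `asyncio.run`. A minimal, cognee-free sketch of that pattern, with a hypothetical `_fetch` coroutine standing in for the awaited cognee calls:

```python
import asyncio


async def _fetch(query: str) -> str:
    # Stand-in for an awaited cognee call, e.g. `await cognee.search(...)`.
    await asyncio.sleep(0)
    return f"results for {query!r}"


def fetch(query: str) -> str:
    """Synchronous facade over the async helper, as the cognee tools do."""

    async def _run():
        return await _fetch(query)

    return asyncio.run(_run())


print(fetch("NLP"))  # → results for 'NLP'
```

One caveat of this pattern: `asyncio.run` raises a `RuntimeError` if called from inside an already-running event loop, so wrappers like these are meant for synchronous agent contexts, not for code that is itself running under asyncio.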
45 changes: 45 additions & 0 deletions agentstack/_tools/cognee/config.json
@@ -0,0 +1,45 @@
{
    "name": "cognee",
    "category": "Memory",
    "tools": [
        "prune_data",
        "add_data",
        "cognify",
        "search_insights",
        "search_summaries",
        "search_chunks",
        "search_completion",
        "search_graph_completion"
    ],
    "url": "https://github.com/topoteretes/cognee",
    "tools_bundled": true,
    "cta": "Cognee installed! Please set your cognee env variables.",
    "env": {
        "ENV": "local",
        "TOKENIZERS_PARALLELISM": "false",
        "LLM_API_KEY": "",
        "LLM_MODEL": "openai/gpt-4o-mini",
        "LLM_PROVIDER": "openai",
        "LLM_ENDPOINT": "",
        "LLM_API_VERSION": "",
        "EMBEDDING_PROVIDER": "openai",
        "EMBEDDING_API_KEY": "",
        "EMBEDDING_MODEL": "openai/text-embedding-3-large",
        "EMBEDDING_ENDPOINT": "",
        "EMBEDDING_API_VERSION": "",
        "EMBEDDING_DIMENSIONS": 3072,
        "EMBEDDING_MAX_TOKENS": 8191,
        "GRAPH_DATABASE_PROVIDER": "networkx",
        "VECTOR_DB_PROVIDER": "lancedb",
        "DB_PROVIDER": "sqlite",
        "DB_NAME": "cognee_db"
    },
    "packages": [
        "cognee"
    ],
    "post_install": "Now, you can start cognifying!",
    "post_remove": "Cognee is removed!"
}

1 change: 1 addition & 0 deletions docs/tools/community.mdx
@@ -17,6 +17,7 @@ description: 'AgentStack tools from community contributors'
## Memory / State

- [Mem0](/tools/tool/mem0)
- [cognee](/tools/tool/cognee)

## Code Execution

56 changes: 56 additions & 0 deletions docs/tools/tool/cognee.mdx
@@ -0,0 +1,56 @@
---
title: 'cognee'
description: 'AI Memory Engine for Enhanced LLM Accuracy'
---

## Overview

[cognee](https://www.cognee.ai/) is an AI memory engine designed to improve the accuracy of Large Language Models (LLMs). It connects data points to uncover hidden links, providing your LLM applications with a better understanding of your data, leading to more reliable responses.

## Installation

1. To install the `cognee` tool in AgentStack, run:

```bash
agentstack tools add cognee
```

2. Set environment variables:
   Configure the necessary environment variables in your environment or `.env` file. For the default setup you only need to add your `LLM_API_KEY`; adjust the other values as needed for your providers and databases.
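With the defaults shipped in this tool's `config.json`, a minimal `.env` for the stock OpenAI setup might look like the sketch below; only the API key values are yours to fill in, and the rest simply mirror the bundled defaults:

```bash
ENV=local
TOKENIZERS_PARALLELISM=false
LLM_PROVIDER=openai
LLM_MODEL=openai/gpt-4o-mini
LLM_API_KEY=sk-...
EMBEDDING_PROVIDER=openai
EMBEDDING_MODEL=openai/text-embedding-3-large
EMBEDDING_API_KEY=sk-...
GRAPH_DATABASE_PROVIDER=networkx
VECTOR_DB_PROVIDER=lancedb
DB_PROVIDER=sqlite
DB_NAME=cognee_db
```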

## Usage
Cognee provides a suite of tools to manage and utilize AI memory effectively. Below is a set of tools available to AgentStack and their functionalities:

### Available Tools
- `prune_data`: Clears the cognee data store, with an option to also prune system metadata.
- `add_data`: Adds any type of data to cognee's data store for future processing.
- `cognify`: Processes the added data to build a knowledge graph, summaries, and other metadata.
- `search_insights`: Performs an insights search on the knowledge graph for a given query.
- `search_summaries`: Searches for summaries related to the query in the knowledge graph.
- `search_chunks`: Searches for specific chunks of data related to the query.
- `search_completion`: Retrieves the document chunk most relevant to the user's query and prompts the LLM to answer using this context.
- `search_graph_completion`: Identifies the knowledge graph entities, including document chunks, most relevant to the user's query and prompts the LLM to generate a response with this enriched context.


For more features, please visit [cognee's repo](https://github.com/topoteretes/cognee).
Cognee's behavior can be customized by modifying `agentstack/_tools/cognee/__init__.py` in your project.


### Example

#### Using cognee for your Agent's tasks

**Description:**

1. **Prune Data:** Start by pruning existing data in cognee to ensure a clean slate.
2. **Add Data:** Add text to cognee's knowledge base. Example: "Natural language processing (NLP) is an interdisciplinary subfield of computer science and information retrieval."
3. **Cognify:** Process the added text to build the knowledge graph and related metadata.
4. **Search:** Perform searches using cognee's search functions with a query. Example: "Tell me about NLP". Present each search result separately, specifying the function name used for each search type.
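The four steps above can be sketched as a plain Python sequence. In this sketch the real cognee-backed tools are replaced with hypothetical stubs that record call order, so the prune → add → cognify → search pipeline is visible without needing cognee installed:

```python
calls = []


# Hypothetical stand-ins for the bundled cognee tools, recording call order.
def prune_data(metadata: bool = False) -> str:
    calls.append("prune_data")
    return "Data pruned successfully."


def add_data(data) -> str:
    calls.append("add_data")
    return "Data added successfully."


def cognify() -> str:
    calls.append("cognify")
    return "Cognify process complete."


def search_insights(query_text: str) -> str:
    calls.append("search_insights")
    return f"insights for {query_text!r}"


prune_data(metadata=True)  # 1. start from a clean slate
add_data("Natural language processing (NLP) is an interdisciplinary subfield.")  # 2. ingest text
cognify()  # 3. build the knowledge graph and metadata
result = search_insights("Tell me about NLP")  # 4. query it
```

With the real tools, `search_insights` (or any of the other search functions) would return the stringified matches drawn from the text added in step 2.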

**Expected Output:**

The agent should retrieve answers from the provided text based on the query.



For more detailed information and advanced configurations, refer to the official [cognee documentation](https://docs.cognee.ai/).
