by topoteretes
Provides dynamic memory for AI agents through modular ECL (Extract, Cognify, Load) pipelines, enabling seamless integration with graph and vector stores using minimal code.
Cognee enables AI agents to store, retrieve, and reason over past interactions, documents, images, and audio transcriptions. It replaces traditional Retrieval‑Augmented Generation (RAG) workflows with a unified, low‑code solution that builds and queries knowledge graphs and vector databases.
```shell
pip install cognee
```
Set your LLM API key in a `.env` file or directly in code:

```python
import os

os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
```
```python
import cognee
import asyncio

async def main():
    await cognee.add("Natural language processing (NLP) is ...")
    await cognee.cognify()
    results = await cognee.search("Tell me about NLP")
    for r in results:
        print(r)

asyncio.run(main())
```
Q: Which LLM providers are supported? A: Any provider whose API is compatible with the OpenAI request schema; local models via Ollama are also supported.
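"Compatible with the OpenAI request schema" means the provider accepts the same JSON request body as OpenAI's chat-completions endpoint. A minimal sketch of that body follows; the endpoint URL and model name are illustrative assumptions (Ollama's default port is 11434), not values taken from cognee's configuration:

```python
import json

# Assumed local Ollama setup; any server that accepts this schema works.
OLLAMA_ENDPOINT = "http://localhost:11434/v1/chat/completions"

# The OpenAI-style chat-completion request body.
payload = {
    "model": "llama3",  # hypothetical local model name
    "messages": [
        {"role": "user", "content": "Tell me about NLP"},
    ],
}

# The request body is plain JSON, identical for OpenAI and compatible servers.
body = json.dumps(payload)
print(body)
```

Swapping providers is then a matter of pointing cognee at a different base URL and key, with the request shape unchanged.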
Q: Do I need a vector database separate from the graph store? A: Cognee can sync embeddings to both a graph database (e.g., Neo4j) and a vector store; you can choose one or both.
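Conceptually, syncing to both backends means every node gets an embedding in the vector store and relationships in the graph store. The following is a toy in-memory sketch of that idea, not cognee's actual storage API:

```python
from collections import defaultdict

# Toy in-memory stand-ins for a vector store and a graph store.
vector_store: dict[str, list[float]] = {}              # node id -> embedding
graph_store: defaultdict[str, set] = defaultdict(set)  # node id -> neighbours

def sync(node_id: str, embedding: list[float], edges: list[str]) -> None:
    """Write one node's embedding and its relationships to both backends."""
    vector_store[node_id] = embedding
    for neighbour in edges:
        graph_store[node_id].add(neighbour)
        graph_store[neighbour].add(node_id)  # undirected for simplicity

sync("NLP", [0.1, 0.9], edges=["computer science", "information retrieval"])
print(sorted(graph_store["NLP"]))  # → ['computer science', 'information retrieval']
```

With real backends (e.g. Neo4j plus a dedicated vector store), the same write fans out to both systems; using only one is a configuration choice.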
Q: How do I set environment variables without editing code? A: Use a .env file based on the provided .env.template, or export the variables in your shell.
Q: Is there a hosted version? A: Yes, the Cogwit beta offers a fully‑hosted AI memory service at https://platform.cognee.ai/.
Q: Can I extend the ingestion pipeline? A: The ECL architecture is modular; you can add custom extractors, transformers, or loaders via Python plugins.
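The exact extension points belong to cognee's plugin API, but the underlying pattern is a chain of asynchronous stages. A generic sketch of that pattern, using our own stage names rather than cognee's actual interfaces:

```python
import asyncio
from typing import Any, Awaitable, Callable

# Each stage is an async callable: the ECL pattern is function composition.
Stage = Callable[[Any], Awaitable[Any]]

async def extract(raw: str) -> list[str]:
    # Extract: split raw text into candidate facts (toy tokenizer).
    return [s.strip() for s in raw.split(".") if s.strip()]

async def cognify(facts: list[str]) -> list[tuple[str, str]]:
    # Cognify: turn facts into (subject, fact) edges for a knowledge graph.
    return [("doc", f) for f in facts]

async def load(edges: list[tuple[str, str]]) -> int:
    # Load: persist edges (here, just count them).
    return len(edges)

async def run_pipeline(data: Any, stages: list[Stage]) -> Any:
    for stage in stages:
        data = await stage(data)
    return data

result = asyncio.run(run_pipeline("NLP is a field. It studies language.",
                                  [extract, cognify, load]))
print(result)  # → 2
```

A custom extractor, transformer, or loader then just replaces one stage in the list while the rest of the pipeline is unchanged.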
cognee - Memory for AI Agents in 5 lines of code
🚀 We launched Cogwit beta (Fully-hosted AI Memory): Sign up here! 🚀
Build dynamic memory for Agents and replace RAG using scalable, modular ECL (Extract, Cognify, Load) pipelines.
Get started quickly with a Google Colab notebook, a Deepnote notebook, or the starter repo.
Your contributions are at the core of making this a true open-source project, and they are greatly appreciated. See CONTRIBUTING.md for more information.
You can install Cognee using pip, poetry, uv, or any other Python package manager. Cognee supports Python 3.10 to 3.13.
```shell
pip install cognee
```
You can install from a local clone of the Cognee repo using pip, poetry, or uv. For a local pip installation, make sure your pip version is 21.3 or above.

```shell
uv sync --all-extras
```
```python
import os

os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
```
You can also set the variables by creating a .env file based on our template. To use different LLM providers, check out our documentation for more information.
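If you prefer not to pull in a third-party loader, a simple `.env` file can be parsed in a few lines. `LLM_API_KEY` is the variable shown above; the parsing helper here is our own minimal sketch (KEY=VALUE lines only, no quoting rules), not part of cognee:

```python
import os

def load_dotenv(path: str = ".env") -> None:
    """Minimal .env parser: KEY=VALUE lines, '#' comments, no quoting rules."""
    try:
        with open(path) as f:
            lines = f.readlines()
    except FileNotFoundError:
        return
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # setdefault: values already exported in the shell win over the file.
        os.environ.setdefault(key.strip(), value.strip())

# Demo: write a sample file, then load it.
os.environ.pop("LLM_API_KEY", None)  # clean slate for the demo
with open(".env", "w") as f:
    f.write("# cognee settings\nLLM_API_KEY=YOUR_OPENAI_API_KEY\n")
load_dotenv()
print(os.environ["LLM_API_KEY"])  # → YOUR_OPENAI_API_KEY
```

Tools such as python-dotenv do the same thing with full quoting and interpolation support.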
This script will run the default pipeline:
```python
import cognee
import asyncio

async def main():
    # Add text to cognee
    await cognee.add("Natural language processing (NLP) is an interdisciplinary subfield of computer science and information retrieval.")

    # Generate the knowledge graph
    await cognee.cognify()

    # Query the knowledge graph
    results = await cognee.search("Tell me about NLP")

    # Display the results
    for result in results:
        print(result)

if __name__ == '__main__':
    asyncio.run(main())
```
Example output:
Natural Language Processing (NLP) is a cross-disciplinary and interdisciplinary field that involves computer science and information retrieval. It focuses on the interaction between computers and human language, enabling machines to understand and process natural language.
You can also cognify your files and query them using the cognee UI.
Try the cognee UI locally here.
We are committed to making open source an enjoyable and respectful experience for our community. See CODE_OF_CONDUCT for more information.
Thanks to the following companies for sponsoring the ongoing development of cognee.
Explore related MCPs that share similar capabilities and solve comparable challenges.
by modelcontextprotocol
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
by basicmachines-co
Enables persistent, local‑first knowledge management by allowing LLMs to read and write Markdown files during natural conversations, building a traversable knowledge graph that stays under the user’s control.
by smithery-ai
Provides read and search capabilities for Markdown notes in an Obsidian vault for Claude Desktop and other MCP clients.
by chatmcp
Summarize chat messages by querying a local chat database and returning concise overviews.
by dmayboroda
Provides on‑premises conversational retrieval‑augmented generation (RAG) with configurable Docker containers, supporting fully local execution, ChatGPT‑based custom GPTs, and Anthropic Claude integration.
by GreatScottyMac
Provides a project‑specific memory bank that stores decisions, progress, architecture, and custom data, exposing a structured knowledge graph via MCP for AI assistants and IDE tools.
by andrea9293
Provides document management and AI-powered semantic search for storing, retrieving, and querying text, markdown, and PDF files locally without external databases.
by scorzeth
Provides a local MCP server that interfaces with a running Anki instance to retrieve, create, and update flashcards through standard MCP calls.
by sirmews
Read and write records in a Pinecone vector index via Model Context Protocol, enabling semantic search and document management for Claude Desktop.