by modelcontextprotocol
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
The servers project provides a basic implementation of persistent memory using a local knowledge graph, allowing AI models such as Claude to remember information about users across different chat sessions. The core of the project revolves around managing entities, relations, and observations within a knowledge graph structure.
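The three building blocks map naturally onto plain record types. Below is a minimal sketch in TypeScript; the field names mirror the JSON examples later in this document, but the `KnowledgeGraph` container shape is an illustrative assumption, not the server's internal type.

```typescript
// Minimal sketch of the memory data model.
// Field names follow the JSON examples in this document; the
// KnowledgeGraph container is an illustrative assumption.

interface Entity {
  name: string;           // unique identifier, e.g. "John_Smith"
  entityType: string;     // classification, e.g. "person"
  observations: string[]; // atomic facts about the entity
}

interface Relation {
  from: string;         // source entity name
  to: string;           // target entity name
  relationType: string; // active voice, e.g. "works_at"
}

interface KnowledgeGraph {
  entities: Entity[];
  relations: Relation[];
}

const graph: KnowledgeGraph = {
  entities: [
    { name: "John_Smith", entityType: "person", observations: ["Speaks fluent Spanish"] },
  ],
  relations: [
    { from: "John_Smith", to: "Anthropic", relationType: "works_at" },
  ],
};
```

Everything the server stores reduces to these two record kinds, which is why the whole graph can live in a single JSON file.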
To use the servers project, integrate it into your Claude Desktop configuration. This involves adding a JSON block to your claude_desktop_config.json file, specifying either Docker or NPX as the command that runs the memory server.
Docker Setup:

```json
{
  "mcpServers": {
    "memory": {
      "command": "docker",
      "args": ["run", "-i", "-v", "claude-memory:/app/dist", "--rm", "mcp/memory"]
    }
  }
}
```
NPX Setup:

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memory"
      ]
    }
  }
}
```
You can also configure the memory storage path using environment variables. For VS Code integration, one-click installation buttons are provided for both VS Code and VS Code Insiders, supporting both NPX and Docker installations.
A system prompt example is provided for chat personalization, guiding the AI on how to identify users, retrieve and update memory during conversations.
Available tools: create_entities, create_relations, add_observations, delete_entities, delete_observations, delete_relations, read_graph, search_nodes, open_nodes.
Q1: How is information stored in the knowledge graph? A1: Information is stored using three main components: Entities (nodes with names and types), Relations (directed links between entities), and Observations (specific facts attached to entities).
Q2: What are the available API operations? A2: The available operations include creating/deleting entities, relations, and observations, as well as reading the entire graph, searching for nodes, and retrieving specific nodes.
Q3: How can I integrate this memory server with Claude Desktop? A3: You can integrate it by adding a specific JSON configuration to your claude_desktop_config.json file, choosing between Docker or NPX for running the server.
Q4: Can I customize where the memory is stored? A4: Yes, you can specify a custom path for memory storage using the MEMORY_FILE_PATH environment variable.
Q5: What kind of information should be stored as observations? A5: Observations should be atomic facts about an entity, such as "Speaks fluent Spanish" or "Graduated in 2019".
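The read operations mentioned in A2 can be sketched against an in-memory graph. This is an illustrative reimplementation, not the server's actual code: in particular, having `searchNodes` match on entity names, types, and observation text is an assumption about the search semantics.

```typescript
interface Entity {
  name: string;
  entityType: string;
  observations: string[];
}

// Illustrative sketch of search_nodes semantics: case-insensitive
// substring match over names, types, and observations. The real
// server's matching rules may differ.
function searchNodes(entities: Entity[], query: string): Entity[] {
  const q = query.toLowerCase();
  return entities.filter(
    (e) =>
      e.name.toLowerCase().includes(q) ||
      e.entityType.toLowerCase().includes(q) ||
      e.observations.some((o) => o.toLowerCase().includes(q)),
  );
}

// open_nodes retrieves specific entities by exact name.
function openNodes(entities: Entity[], names: string[]): Entity[] {
  return entities.filter((e) => names.includes(e.name));
}

const entities: Entity[] = [
  { name: "John_Smith", entityType: "person", observations: ["Graduated in 2019"] },
  { name: "Anthropic", entityType: "organization", observations: [] },
];
```

For example, `searchNodes(entities, "2019")` would surface John_Smith via his observation text, while `openNodes(entities, ["Anthropic"])` fetches a node directly by name.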
Docker build command:

```sh
docker build -t mcp/memory -f src/memory/Dockerfile .
```
The MCP server is licensed under the MIT License.
Entities are the primary nodes in the knowledge graph. Each entity has:

- A unique name (identifier)
- An entity type (e.g. "person", "organization", "event")
- A list of observations
Example:

```json
{
  "name": "John_Smith",
  "entityType": "person",
  "observations": ["Speaks fluent Spanish"]
}
```
Relations define directed connections between entities. They are always stored in active voice and describe how entities interact or relate to each other.
Example:

```json
{
  "from": "John_Smith",
  "to": "Anthropic",
  "relationType": "works_at"
}
```
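Because relations are directed and stored in active voice, traversal direction matters: "John_Smith works_at Anthropic" is stored once, and the inverse reading ("who works at Anthropic?") is recovered by querying the `to` side. A small sketch (the helper names `outgoing` and `incoming` are my own, not part of the server's API):

```typescript
interface Relation {
  from: string;
  to: string;
  relationType: string;
}

// Follow edges in the stored (active-voice) direction.
function outgoing(relations: Relation[], from: string, relationType: string): string[] {
  return relations
    .filter((r) => r.from === from && r.relationType === relationType)
    .map((r) => r.to);
}

// Answer inverse questions by matching the `to` side instead of
// storing a second, passive-voice relation.
function incoming(relations: Relation[], to: string, relationType: string): string[] {
  return relations
    .filter((r) => r.to === to && r.relationType === relationType)
    .map((r) => r.from);
}

const relations: Relation[] = [
  { from: "John_Smith", to: "Anthropic", relationType: "works_at" },
];
```

Storing only the active-voice direction keeps the graph free of redundant mirror edges.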
Observations are discrete pieces of information about an entity. They are:

- Stored as strings
- Attached to specific entities
- Independently addable and removable
- Atomic (one fact per observation)
Example:

```json
{
  "entityName": "John_Smith",
  "observations": [
    "Speaks fluent Spanish",
    "Graduated in 2019",
    "Prefers morning meetings"
  ]
}
```
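Since observations can be added independently of entity creation, an append operation is the natural primitive. The sketch below assumes exact duplicates are skipped on add; that dedup behavior is an illustrative assumption, not documented server behavior.

```typescript
interface Entity {
  name: string;
  entityType: string;
  observations: string[];
}

// Append new observations to a named entity, skipping exact
// duplicates. (Dedup-on-add is an illustrative assumption, not
// confirmed behavior of @modelcontextprotocol/server-memory.)
function addObservations(entities: Entity[], entityName: string, contents: string[]): string[] {
  const entity = entities.find((e) => e.name === entityName);
  if (!entity) throw new Error(`Entity not found: ${entityName}`);
  const added = contents.filter((c) => !entity.observations.includes(c));
  entity.observations.push(...added);
  return added; // the observations actually added
}

const entities: Entity[] = [
  { name: "John_Smith", entityType: "person", observations: ["Speaks fluent Spanish"] },
];
```

Returning only the newly added observations lets a caller report exactly what changed in memory.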
The server exposes the following tools:

create_entities
- entities (array of objects), each containing:
  - name (string): Entity identifier
  - entityType (string): Type classification
  - observations (string[]): Associated observations

create_relations
- relations (array of objects), each containing:
  - from (string): Source entity name
  - to (string): Target entity name
  - relationType (string): Relationship type in active voice

add_observations
- observations (array of objects), each containing:
  - entityName (string): Target entity
  - contents (string[]): New observations to add

delete_entities
- entityNames (string[])

delete_observations
- deletions (array of objects), each containing:
  - entityName (string): Target entity
  - observations (string[]): Observations to remove

delete_relations
- relations (array of objects), each containing:
  - from (string): Source entity name
  - to (string): Target entity name
  - relationType (string): Relationship type

read_graph

search_nodes
- query (string)

open_nodes
- names (string[])

Add this to your claude_desktop_config.json:
Docker:

```json
{
  "mcpServers": {
    "memory": {
      "command": "docker",
      "args": ["run", "-i", "-v", "claude-memory:/app/dist", "--rm", "mcp/memory"]
    }
  }
}
```
NPX:

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memory"
      ]
    }
  }
}
```
The server can be configured using the following environment variables:
```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-memory"
      ],
      "env": {
        "MEMORY_FILE_PATH": "/path/to/custom/memory.json"
      }
    }
  }
}
```
- MEMORY_FILE_PATH: Path to the memory storage JSON file (default: memory.json in the server directory)

For quick installation, use the one-click installation buttons for VS Code or VS Code Insiders.
For manual installation, add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open Settings (JSON).
Optionally, you can add it to a file called .vscode/mcp.json in your workspace. This will allow you to share the configuration with others. Note that the mcp key is not needed in the .vscode/mcp.json file.
NPX:

```json
{
  "mcp": {
    "servers": {
      "memory": {
        "command": "npx",
        "args": [
          "-y",
          "@modelcontextprotocol/server-memory"
        ]
      }
    }
  }
}
```
Docker:

```json
{
  "mcp": {
    "servers": {
      "memory": {
        "command": "docker",
        "args": [
          "run",
          "-i",
          "-v",
          "claude-memory:/app/dist",
          "--rm",
          "mcp/memory"
        ]
      }
    }
  }
}
```
The prompt for utilizing memory depends on the use case. Changing the prompt will help the model determine the frequency and types of memories created.
Here is an example prompt for chat personalization. You could use this prompt in the "Custom Instructions" field of a Claude.ai Project.
```
Follow these steps for each interaction:

1. User Identification:
   - You should assume that you are interacting with default_user
   - If you have not identified default_user, proactively try to do so.

2. Memory Retrieval:
   - Always begin your chat by saying only "Remembering..." and retrieve all relevant information from your knowledge graph
   - Always refer to your knowledge graph as your "memory"

3. Memory:
   - While conversing with the user, be attentive to any new information that falls into these categories:
     a) Basic Identity (age, gender, location, job title, education level, etc.)
     b) Behaviors (interests, habits, etc.)
     c) Preferences (communication style, preferred language, etc.)
     d) Goals (goals, targets, aspirations, etc.)
     e) Relationships (personal and professional relationships up to 3 degrees of separation)

4. Memory Update:
   - If any new information was gathered during the interaction, update your memory as follows:
     a) Create entities for recurring organizations, people, and significant events
     b) Connect them to the current entities using relations
     c) Store facts about them as observations
```
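Step 4 of the prompt maps directly onto the server's write tools. As a sketch, the sequence of tool calls an assistant might issue after learning "John works at Anthropic" could look like the payloads below. The tool names and argument shapes follow the API reference earlier in this document; the surrounding trace structure is illustrative only.

```typescript
// Hypothetical trace of the memory-update step: each element names a
// memory tool plus the arguments the assistant would send. Tool names
// and argument shapes follow the API reference above; the trace
// structure itself is illustrative.
const memoryUpdate = [
  {
    tool: "create_entities",
    args: {
      entities: [{ name: "Anthropic", entityType: "organization", observations: [] }],
    },
  },
  {
    tool: "create_relations",
    args: {
      relations: [{ from: "John_Smith", to: "Anthropic", relationType: "works_at" }],
    },
  },
  {
    tool: "add_observations",
    args: {
      observations: [{ entityName: "John_Smith", contents: ["Works at Anthropic"] }],
    },
  },
];
```

Note the order: the new entity is created first, then connected via a relation, and finally the fact is recorded as an observation, matching sub-steps 4a through 4c of the prompt.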
Docker:

```sh
docker build -t mcp/memory -f src/memory/Dockerfile .
```
This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.
Explore related MCPs that share similar capabilities and solve comparable challenges:

- by topoteretes: Provides dynamic memory for AI agents through modular ECL (Extract, Cognify, Load) pipelines, enabling seamless integration with graph and vector stores using minimal code.
- by basicmachines-co: Enables persistent, local-first knowledge management by allowing LLMs to read and write Markdown files during natural conversations, building a traversable knowledge graph that stays under the user's control.
- by smithery-ai: Provides read and search capabilities for Markdown notes in an Obsidian vault for Claude Desktop and other MCP clients.
- by chatmcp: Summarizes chat messages by querying a local chat database and returning concise overviews.
- by dmayboroda: Provides on-premises conversational retrieval-augmented generation (RAG) with configurable Docker containers, supporting fully local execution, ChatGPT-based custom GPTs, and Anthropic Claude integration.
- by GreatScottyMac: Provides a project-specific memory bank that stores decisions, progress, architecture, and custom data, exposing a structured knowledge graph via MCP for AI assistants and IDE tools.
- by andrea9293: Provides document management and AI-powered semantic search for storing, retrieving, and querying text, markdown, and PDF files locally without external databases.
- by scorzeth: Provides a local MCP server that interfaces with a running Anki instance to retrieve, create, and update flashcards through standard MCP calls.
- by sirmews: Reads and writes records in a Pinecone vector index via Model Context Protocol, enabling semantic search and document management for Claude Desktop.