by basicmachines-co
Enables persistent, local‑first knowledge management by allowing LLMs to read and write Markdown files during natural conversations, building a traversable knowledge graph that stays under the user’s control.
Basic Memory lets you create a personal knowledge base that grows automatically through chat with large language models. Notes are stored as plain Markdown files on your computer, indexed in a local SQLite database, and linked together to form a semantic graph.
- Install with uv: `uv tool install basic-memory`
- Configure your MCP client to launch the server with `uvx basic-memory mcp` (or `basic-memory mcp`), and optionally enable realtime sync with `basic-memory sync --watch`
- Ask the LLM to create a note, search, read a note, or build context, and the assistant will read/write the Markdown files directly
- The CLI also provides `basic-memory sync`, `basic-memory import`, and canvas generation tools

Q: Do I need an internet connection? A: Only when the LLM itself requires it. All knowledge files are stored locally.
Q: Can I use models other than Claude? A: Yes, any MCP‑compatible LLM can interact with the server.
Q: How are relations represented? A: Using simple Markdown lines like `- relates_to [[Another Note]]`, which the system parses into graph edges.
Q: Is any cloud service involved? A: No. The only external component is the LLM API you choose to call.
Q: How do I back up my data? A: Back up the knowledge directory (default ~/basic-memory) and the SQLite index file (see the sketch below).
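For a concrete starting point, here is a minimal Python backup sketch. The destination folder and the SQLite index path are assumptions for illustration only; locate your installation's actual database file before relying on this.

```python
# Minimal backup sketch -- only ~/basic-memory is a documented default;
# the destination and index path below are assumptions for illustration.
import shutil
from pathlib import Path

home = Path.home()
backup = home / "basic-memory-backup"  # hypothetical backup destination
backup.mkdir(exist_ok=True)

# Copy the Markdown knowledge base (default location per the FAQ above).
shutil.copytree(home / "basic-memory", backup / "basic-memory", dirs_exist_ok=True)

# Copy the SQLite index file -- placeholder path, check where yours lives.
index = home / ".basic-memory" / "memory.db"
if index.exists():
    shutil.copy2(index, backup / index.name)
```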
Basic Memory lets you build persistent knowledge through natural conversations with Large Language Models (LLMs) like Claude, while keeping everything in simple Markdown files on your computer. It uses the Model Context Protocol (MCP) to enable any compatible LLM to read and write to your local knowledge base.
https://github.com/user-attachments/assets/a55d8238-8dd0-454a-be4c-8860dbbd0ddc
# Install with uv (recommended)
uv tool install basic-memory
# Configure Claude Desktop (edit ~/Library/Application Support/Claude/claude_desktop_config.json)
# Add this to your config:
{
  "mcpServers": {
    "basic-memory": {
      "command": "uvx",
      "args": [
        "basic-memory",
        "mcp"
      ]
    }
  }
}
# Now in Claude Desktop, you can:
# - Write notes with "Create a note about coffee brewing methods"
# - Read notes with "What do I know about pour over coffee?"
# - Search with "Find information about Ethiopian beans"
You can view shared context via files in ~/basic-memory (the default directory location).
You can use Smithery to automatically configure Basic Memory for Claude Desktop:
npx -y @smithery/cli install @basicmachines-co/basic-memory --client claude
This installs and configures Basic Memory without requiring manual edits to the Claude Desktop configuration file. The Smithery server hosts the MCP server component, while your data remains stored locally as Markdown files.
Most LLM interactions are ephemeral: you ask a question, get an answer, and everything is forgotten. Each conversation starts fresh, without the context or knowledge from previous ones, and the usual workarounds have real limitations.
Basic Memory addresses these problems with a simple approach: structured Markdown files that both humans and LLMs can read and write. Knowledge captured in one conversation persists as plain text on your computer, stays under your control, and can be built on in later conversations.
Let's say you're exploring coffee brewing methods and want to capture your knowledge. Here's how it works:
I've been experimenting with different coffee brewing methods. Key things I've learned:
- Pour over gives more clarity in flavor than French press
- Water temperature is critical - around 205°F seems best
- Freshly ground beans make a huge difference
... continue conversation.
"Let's write a note about coffee brewing methods."
The LLM creates a new Markdown file on your system (which you can see instantly in Obsidian or your editor):
---
title: Coffee Brewing Methods
permalink: coffee-brewing-methods
tags:
- coffee
- brewing
---
# Coffee Brewing Methods
## Observations
- [method] Pour over provides more clarity and highlights subtle flavors
- [technique] Water temperature at 205°F (96°C) extracts optimal compounds
- [principle] Freshly ground beans preserve aromatics and flavor
## Relations
- relates_to [[Coffee Bean Origins]]
- requires [[Proper Grinding Technique]]
- affects [[Flavor Extraction]]
The note embeds semantic content and links to other topics via simple Markdown formatting.
The note is saved in your local knowledge directory (default ~/basic-memory). Running basic-memory sync --watch keeps the local index up to date as files change.
Look at `coffee-brewing-methods` for context about pour over coffee
The LLM can now build rich context from the knowledge graph. For example:
Following relation 'relates_to [[Coffee Bean Origins]]':
- Found information about Ethiopian Yirgacheffe
- Notes on Colombian beans' nutty profile
- Altitude effects on bean characteristics
Following relation 'requires [[Proper Grinding Technique]]':
- Burr vs. blade grinder comparisons
- Grind size recommendations for different methods
- Impact of consistent particle size on extraction
Each related document can lead to more context, building a rich semantic understanding of your knowledge base.
This creates a two-way flow: both you and the LLM read and write the same Markdown files, so knowledge captured in one conversation carries into the next.
Under the hood, Basic Memory parses each Markdown file into an Entity object. An Entity can have Observations, or facts associated with it, and Relations connect entities together to form the knowledge graph. The file format is just Markdown with some simple markup:
Each Markdown file has:
title: <Entity title>
type: <The type of Entity> (e.g. note)
permalink: <a uri slug>
- <optional metadata> (such as tags)
Observations are facts about a topic.
They can be added by creating a Markdown list with a special format that can reference a category, tags (using a "#" character), and an optional context.
Observation Markdown format:
- [category] content #tag (optional context)
Examples of observations:
- [method] Pour over extracts more floral notes than French press
- [tip] Grind size should be medium-fine for pour over #brewing
- [preference] Ethiopian beans have bright, fruity flavors (especially from Yirgacheffe)
- [fact] Lighter roasts generally contain more caffeine than dark roasts
- [experiment] Tried 1:15 coffee-to-water ratio with good results
- [resource] James Hoffman's V60 technique on YouTube is excellent
- [question] Does water temperature affect extraction of different compounds differently?
- [note] My favorite local shop uses a 30-second bloom time
Relations are links to other topics. They define how entities connect in the knowledge graph.
Markdown format:
- relation_type [[WikiLink]] (optional context)
Examples of relations:
- pairs_well_with [[Chocolate Desserts]]
- grown_in [[Ethiopia]]
- contrasts_with [[Tea Brewing Methods]]
- requires [[Burr Grinder]]
- improves_with [[Fresh Beans]]
- relates_to [[Morning Routine]]
- inspired_by [[Japanese Coffee Culture]]
- documented_in [[Coffee Journal]]
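To make these formats concrete, here is a rough Python sketch of how observation and relation lines could be parsed into graph data. It is an illustrative approximation based only on the formats shown above, not Basic Memory's actual parser.

```python
# Illustrative parser for the observation/relation formats above.
# Not Basic Memory's implementation -- a simplified approximation.
import re

OBSERVATION = re.compile(
    r"^- \[(?P<category>[^\]]+)\] (?P<content>.+?)(?:\s+\((?P<context>[^)]*)\))?$"
)
RELATION = re.compile(
    r"^- (?P<rel_type>\w+) \[\[(?P<target>[^\]]+)\]\](?:\s+\((?P<context>[^)]*)\))?$"
)

def parse_line(line: str):
    """Classify a Markdown list line as an observation, a relation, or neither."""
    line = line.strip()
    if m := OBSERVATION.match(line):
        return {
            "kind": "observation",
            "category": m["category"],
            "content": m["content"],
            "tags": re.findall(r"#([\w-]+)", m["content"]),
            "context": m["context"],
        }
    if m := RELATION.match(line):
        return {
            "kind": "relation",
            "type": m["rel_type"],
            "target": m["target"],  # becomes an edge to another entity
            "context": m["context"],
        }
    return None

print(parse_line("- [tip] Grind size should be medium-fine for pour over #brewing"))
print(parse_line("- relates_to [[Coffee Bean Origins]] (see bean notes)"))
```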
Add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open User Settings (JSON).
{
  "mcp": {
    "servers": {
      "basic-memory": {
        "command": "uvx",
        "args": ["basic-memory", "mcp"]
      }
    }
  }
}
Optionally, you can add it to a file called .vscode/mcp.json in your workspace. This will allow you to share the configuration with others.
{
  "servers": {
    "basic-memory": {
      "command": "uvx",
      "args": ["basic-memory", "mcp"]
    }
  }
}
You can use Basic Memory with VS Code to easily retrieve and store information while coding.
Basic Memory is built using the MCP (Model Context Protocol) and works with the Claude desktop app (https://claude.ai/):
Edit your MCP configuration file (on macOS, usually located at ~/Library/Application Support/Claude/claude_desktop_config.json):
{
  "mcpServers": {
    "basic-memory": {
      "command": "uvx",
      "args": [
        "basic-memory",
        "mcp"
      ]
    }
  }
}
If you want to use a specific project (see Multiple Projects below), update your Claude Desktop config:
{
  "mcpServers": {
    "basic-memory": {
      "command": "uvx",
      "args": [
        "basic-memory",
        "mcp",
        "--project",
        "your-project-name"
      ]
    }
  }
}
# One-time sync of local knowledge updates
basic-memory sync
# Run realtime sync process (recommended)
basic-memory sync --watch
write_note(title, content, folder, tags) - Create or update notes
read_note(identifier, page, page_size) - Read notes by title or permalink
build_context(url, depth, timeframe) - Navigate knowledge graph via memory:// URLs
search(query, page, page_size) - Search across your knowledge base
recent_activity(type, depth, timeframe) - Find recently updated information
canvas(nodes, edges, title, folder) - Generate knowledge visualizations
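If you want to exercise these tools outside a chat client, the sketch below calls them over stdio using the official MCP Python SDK (the `mcp` package), which is an assumption about your setup rather than part of Basic Memory itself. The tool and argument names mirror the list above; the folder value and note content are arbitrary examples.

```python
# Sketch: call Basic Memory's MCP tools directly via the MCP Python SDK (stdio).
# Assumes `uv tool install basic-memory` and the `mcp` package are installed.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="uvx", args=["basic-memory", "mcp"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Create or update a note ("notes" folder is an arbitrary example).
            await session.call_tool("write_note", arguments={
                "title": "Coffee Brewing Methods",
                "folder": "notes",
                "content": "# Coffee Brewing Methods\n- [method] Pour over gives more clarity",
            })

            # Search the knowledge base and print the raw tool result.
            result = await session.call_tool("search", arguments={"query": "pour over"})
            print(result)

asyncio.run(main())
```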
"Create a note about our project architecture decisions"
"Find information about JWT authentication in my notes"
"Create a canvas visualization of my project components"
"Read my notes on the authentication system"
"What have I been working on in the past week?"
See the Documentation for more info.
AGPL-3.0
Contributions are welcome. See the Contributing guide for info about setting up the project locally and submitting PRs.
Built with ♥️ by Basic Machines
{ "mcpServers": { "basic-memory": { "command": "uvx", "args": [ "basic-memory", "mcp" ], "env": {} } } }
Explore related MCPs that share similar capabilities and solve comparable challenges
by modelcontextprotocol
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
by topoteretes
Provides dynamic memory for AI agents through modular ECL (Extract, Cognify, Load) pipelines, enabling seamless integration with graph and vector stores using minimal code.
by smithery-ai
Provides read and search capabilities for Markdown notes in an Obsidian vault for Claude Desktop and other MCP clients.
by chatmcp
Summarize chat messages by querying a local chat database and returning concise overviews.
by dmayboroda
Provides on‑premises conversational retrieval‑augmented generation (RAG) with configurable Docker containers, supporting fully local execution, ChatGPT‑based custom GPTs, and Anthropic Claude integration.
by GreatScottyMac
Provides a project‑specific memory bank that stores decisions, progress, architecture, and custom data, exposing a structured knowledge graph via MCP for AI assistants and IDE tools.
by andrea9293
Provides document management and AI-powered semantic search for storing, retrieving, and querying text, markdown, and PDF files locally without external databases.
by scorzeth
Provides a local MCP server that interfaces with a running Anki instance to retrieve, create, and update flashcards through standard MCP calls.
by sirmews
Read and write records in a Pinecone vector index via Model Context Protocol, enabling semantic search and document management for Claude Desktop.