by doobidoo
Provides a universal memory service with semantic search, intelligent memory triggers, OAuth‑enabled team collaboration, and multi‑client support for Claude Desktop, Claude Code, VS Code, Cursor and over a dozen AI applications.
MCP Memory Service offers a persistent, searchable knowledge store that AI assistants can read from and write to automatically. It integrates with the Model Context Protocol to supply relevant memories during a chat, supports natural‑language time and tag queries, and can consolidate memories across sessions.
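The natural-language time queries mentioned above can be illustrated with a toy parser. This is only a sketch: the function name `parse_time_expression` and the two supported phrases are hypothetical, and the service's real parser handles far richer expressions.

```python
from datetime import datetime, timedelta

def parse_time_expression(expr, now=None):
    """Toy parser: map a natural-language phrase to a (start, end) range.
    Illustrative only -- the real service supports many more expressions."""
    now = now or datetime(2025, 1, 15, 12, 0)
    expr = expr.lower().strip()
    if expr == "yesterday":
        start = (now - timedelta(days=1)).replace(hour=0, minute=0,
                                                  second=0, microsecond=0)
        return start, start + timedelta(days=1)
    if expr == "last week":
        return now - timedelta(days=7), now
    raise ValueError(f"unrecognized expression: {expr}")

start, end = parse_time_expression("yesterday")
print(start, "->", end)
```

A recall query like `uv run memory recall "yesterday"` would resolve the phrase to a concrete time window like this and filter memories by their stored timestamps.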
git clone https://github.com/doobidoo/mcp-memory-service.git
cd mcp-memory-service
python install.py # default lightweight SQLite‑vec backend
# optional ML stack
# python install.py --with-ml
# optional ChromaDB backend for multi‑client sharing
# python install.py --with-chromadb
uv run memory server # or: python -m mcp_memory_service.server
export MCP_OAUTH_ENABLED=true
uv run memory server --http # starts the HTTP API on port 8000
claude mcp add --transport http memory-service http://localhost:8000/mcp
The client will auto‑discover OAuth endpoints and obtain a token.

uv run memory store "Fixed authentication race condition"
uv run memory recall "authentication race condition"
uv run memory search --tags python debugging
uv run memory health # shows server health and OAuth status
Q: Do I need heavy ML libraries?
A: No. The default installation uses SQLite‑vec with ONNX embeddings (<100 MB). Add --with-ml only if you need advanced models.
Q: My Python version is 3.13 – will it work?
A: SQLite‑vec wheels may be missing for 3.13. The installer falls back to building from source, or you can switch to the ChromaDB backend.
Q: How does OAuth work for Claude Code?
A: Enable MCP_OAUTH_ENABLED=true and start the server with --http. Clients discover the /oauth/.well-known endpoint, register themselves, and receive JWTs for each request.
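The "receive JWTs for each request" step can be sketched with a minimal HS256 signer built from the standard library. This is illustrative only: the service implements full OAuth 2.1, and the secret, client ID, and claims below are hypothetical.

```python
import base64, hashlib, hmac, json, time

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_jwt(client_id: str, secret: bytes, ttl: int = 3600) -> str:
    """Sign a minimal HS256 JWT, as a server would after client registration."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps({"sub": client_id,
                                 "exp": int(time.time()) + ttl}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_jwt(token: str, secret: bytes) -> dict:
    """Check signature and expiry before trusting the claims."""
    header, payload, sig = token.split(".")
    expected = b64url(hmac.new(secret, f"{header}.{payload}".encode(),
                               hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(payload + "=" * (-len(payload) % 4)))
    if claims["exp"] < time.time():
        raise ValueError("token expired")
    return claims

token = issue_jwt("claude-code-client", b"demo-secret")
print(verify_jwt(token, b"demo-secret")["sub"])  # claude-code-client
```

Each HTTP request then carries the token in an `Authorization: Bearer <token>` header, and the server verifies it before touching the memory store.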
Q: Can I run the service in Docker?
A: Yes. Use the provided docker-compose.yml for MCP protocol or docker-compose.http.yml for the HTTP/OAuth API.
Q: What storage backend should I choose?
A: Use SQLite‑vec for single‑user or lightweight scenarios. Choose ChromaDB when multiple users need to share the same memory store.
Universal MCP memory service with intelligent memory triggers, OAuth 2.1 team collaboration, and semantic memory search for AI assistants. Features Natural Memory Triggers v7.1.0 with 85%+ trigger accuracy, Claude Code HTTP transport, zero-configuration authentication, and enterprise security. Works with Claude Desktop, VS Code, Cursor, Continue, and 13+ AI applications with SQLite-vec for fast local search and Cloudflare for global distribution.
🤖 Intelligent Memory Awareness (Zero Configuration):
# 1. Install MCP Memory Service
git clone https://github.com/doobidoo/mcp-memory-service.git
cd mcp-memory-service && python install.py
# 2. Install Natural Memory Triggers
cd claude-hooks && python install_hooks.py --natural-triggers
# 3. Test intelligent triggers
node memory-mode-controller.js status
# ✅ Done! Claude Code now automatically detects when you need memory context
📖 Complete Guide: Natural Memory Triggers v7.1.0
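How might a trigger decide when memory context is needed? A toy scoring heuristic conveys the idea; the patterns, weights, and threshold here are all hypothetical, and the real Natural Memory Triggers v7.1.0 system is considerably more sophisticated.

```python
# Hypothetical pattern weights -- not the service's actual trigger model.
TRIGGER_PATTERNS = {
    "remember": 0.6,
    "last time": 0.7,
    "we discussed": 0.6,
    "previously": 0.5,
}

def trigger_score(message: str) -> float:
    """Sum the weights of patterns found in the message, capped at 1.0."""
    text = message.lower()
    score = sum(w for pat, w in TRIGGER_PATTERNS.items() if pat in text)
    return min(score, 1.0)

def should_inject_memory(message: str, threshold: float = 0.5) -> bool:
    """Inject stored context only when the score clears the threshold."""
    return trigger_score(message) >= threshold

print(should_inject_memory("What did we discuss last time about auth?"))  # True
print(should_inject_memory("Write a quicksort in Python"))                # False
```

When the score clears the threshold, the hook fetches relevant memories and injects them into the conversation; otherwise the message passes through untouched.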
🔗 Claude Code Team Collaboration (Zero Configuration):
# 1. Start OAuth-enabled server
export MCP_OAUTH_ENABLED=true
uv run memory server --http
# 2. Add HTTP transport to Claude Code
claude mcp add --transport http memory-service http://localhost:8000/mcp
# ✅ Done! Claude Code automatically handles OAuth registration and team collaboration
📖 Complete Setup Guide: OAuth 2.1 Setup Guide
Universal Installer (Most Compatible):
# Clone and install with automatic platform detection
git clone https://github.com/doobidoo/mcp-memory-service.git
cd mcp-memory-service
# Lightweight installation (SQLite-vec with ONNX embeddings - recommended)
python install.py
# Add full ML capabilities (torch + sentence-transformers for advanced features)
python install.py --with-ml
# Add ChromaDB backend support (includes full ML stack - for multi-client setups)
python install.py --with-chromadb
📝 Installation Options Explained:
--with-ml: Adds PyTorch + sentence-transformers for advanced ML features - heavier but more capable
--with-chromadb: Multi-client local server support - use only if you need shared team access

Docker (Fastest):
# For MCP protocol (Claude Desktop)
docker-compose up -d
# For HTTP API + OAuth (Team Collaboration)
docker-compose -f docker-compose.http.yml up -d
Smithery (Claude Desktop):
# Auto-install for Claude Desktop
npx -y @smithery/cli install @doobidoo/mcp-memory-service --client claude
Updating from an older version? Scripts have been reorganized for better maintainability:
- Recommended: python -m mcp_memory_service.server in your Claude Desktop config (no path dependencies!)
- Alternative: uv run memory server with UV tooling
- Moved: scripts/run_memory_server.py is now scripts/server/run_memory_server.py

On your first run, you'll see some warnings that are completely normal:
These warnings disappear after the first successful run. The service is working correctly! For details, see our First-Time Setup Guide.
sqlite-vec may not have pre-built wheels for Python 3.13 yet. If installation fails:
- Install Python 3.12 instead: brew install python@3.12
- Or switch backends: python install.py --storage-backend chromadb --with-chromadb

macOS users may encounter enable_load_extension errors with sqlite-vec:
- Homebrew Python: brew install python && rehash
- pyenv: PYTHON_CONFIGURE_OPTS='--enable-loadable-sqlite-extensions' pyenv install 3.12.0
- Or use the ChromaDB backend: python install.py --with-chromadb

👉 Visit our comprehensive Wiki for detailed guides:
Note: All heavy ML dependencies (PyTorch, sentence-transformers, ChromaDB) are now optional to dramatically reduce build times and image sizes. SQLite-vec uses lightweight ONNX embeddings by default. Install with --with-ml for full ML capabilities or --with-chromadb for multi-client features.
# Start OAuth-enabled server for team collaboration
export MCP_OAUTH_ENABLED=true
uv run memory server --http
# Claude Code team members connect via HTTP transport
claude mcp add --transport http memory-service http://your-server:8000/mcp
# → Automatic OAuth discovery, registration, and authentication
# Store a memory
uv run memory store "Fixed race condition in authentication by adding mutex locks"
# Search for relevant memories
uv run memory recall "authentication race condition"
# Search by tags
uv run memory search --tags python debugging
# Check system health (shows OAuth status)
uv run memory health
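Tag search like `memory search --tags python debugging` is conceptually a filter over stored memories. A minimal in-memory sketch (illustrative only; the `Memory` class and `search_by_tags` helper are hypothetical, not the service's storage code):

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    content: str
    tags: set = field(default_factory=set)

# A tiny stand-in for the persistent store.
store = [
    Memory("Fixed race condition in authentication by adding mutex locks",
           {"python", "debugging"}),
    Memory("Chose SQLite-vec as the default backend", {"architecture"}),
]

def search_by_tags(memories, tags):
    """Return memories carrying ALL of the requested tags."""
    wanted = set(tags)
    return [m for m in memories if wanted <= m.tags]

for m in search_by_tags(store, ["python", "debugging"]):
    print(m.content)
```

The real backends (SQLite-vec, ChromaDB) persist memories and index tags, but the query semantics match this filter.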
Recommended approach - Add to your Claude Desktop config (~/.claude/config.json):
{
"mcpServers": {
"memory": {
"command": "python",
"args": ["-m", "mcp_memory_service.server"],
"env": {
"MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec"
}
}
}
}
Alternative approaches:
// Option 1: UV tooling (if using UV)
{
"mcpServers": {
"memory": {
"command": "uv",
"args": ["--directory", "/path/to/mcp-memory-service", "run", "memory", "server"],
"env": {
"MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec"
}
}
}
}
// Option 2: Direct script path (v6.17.0+)
{
"mcpServers": {
"memory": {
"command": "python",
"args": ["/path/to/mcp-memory-service/scripts/server/run_memory_server.py"],
"env": {
"MCP_MEMORY_STORAGE_BACKEND": "sqlite_vec"
}
}
}
}
# Storage backend (sqlite_vec recommended)
export MCP_MEMORY_STORAGE_BACKEND=sqlite_vec
# Enable HTTP API
export MCP_HTTP_ENABLED=true
export MCP_HTTP_PORT=8000
# Security
export MCP_API_KEY="your-secure-key"
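The variables above could be consumed roughly like this. A sketch, assuming the documented names; the fallback defaults and the idea that an unset `MCP_API_KEY` disables auth are assumptions, not confirmed behavior.

```python
import os

def load_config(env=None):
    """Read the documented MCP_* variables with assumed fallback defaults."""
    env = os.environ if env is None else env
    return {
        "storage_backend": env.get("MCP_MEMORY_STORAGE_BACKEND", "sqlite_vec"),
        "http_enabled": env.get("MCP_HTTP_ENABLED", "false").lower() == "true",
        "http_port": int(env.get("MCP_HTTP_PORT", "8000")),
        "api_key": env.get("MCP_API_KEY"),  # assumed: None disables API-key auth
    }

cfg = load_config({"MCP_HTTP_ENABLED": "true", "MCP_HTTP_PORT": "9090"})
print(cfg["http_enabled"], cfg["http_port"])  # True 9090
```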
┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐
│ AI Clients │ │ MCP Memory │ │ Storage Backend │
│ │ │ Service v7.0 │ │ │
│ • Claude Desktop│◄──►│ • MCP Protocol │◄──►│ • SQLite-vec │
│ • Claude Code │ │ • HTTP Transport│ │ • ChromaDB │
│ (HTTP/OAuth) │ │ • OAuth 2.1 Auth│ │ • Cloudflare │
│ • VS Code │ │ • Memory Store │ │ • Hybrid │
│ • Cursor │ │ • Semantic │ │ │
│ • 13+ AI Apps │ │ Search │ │ │
└─────────────────┘ └─────────────────┘ └─────────────────┘
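The "Semantic Search" box in the diagram boils down to nearest-neighbor lookup over embedding vectors. A toy sketch with hand-made 3-d vectors (the real service embeds text with ONNX or sentence-transformer models into much higher-dimensional space):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hand-made "embeddings" stand in for real model output.
memories = {
    "auth race condition fix": [0.9, 0.1, 0.0],
    "docker compose setup":    [0.0, 0.2, 0.9],
}
query = [0.8, 0.2, 0.1]  # pretend embedding of "authentication bug"

best = max(memories, key=lambda k: cosine(query, memories[k]))
print(best)  # auth race condition fix
```

SQLite-vec and ChromaDB both accelerate exactly this ranking with vector indexes instead of a linear scan.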
mcp-memory-service/
├── src/mcp_memory_service/ # Core application
│ ├── models/ # Data models
│ ├── storage/ # Storage backends
│ ├── web/ # HTTP API & dashboard
│ └── server.py # MCP server
├── scripts/ # Utilities & installation
├── tests/ # Test suite
└── tools/docker/ # Docker configuration
See CONTRIBUTING.md for detailed guidelines.
Run python scripts/validation/validate_configuration_complete.py to check your setup.

Real-world metrics from active deployments:
Apache License 2.0 - see LICENSE for details.
Ready to supercharge your AI workflow? 🚀
👉 Start with our Installation Guide or explore the Wiki for comprehensive documentation.
Transform your AI conversations into persistent, searchable knowledge that grows with you.
Explore related MCPs that share similar capabilities and solve comparable challenges
by modelcontextprotocol
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
by topoteretes
Provides dynamic memory for AI agents through modular ECL (Extract, Cognify, Load) pipelines, enabling seamless integration with graph and vector stores using minimal code.
by basicmachines-co
Enables persistent, local‑first knowledge management by allowing LLMs to read and write Markdown files during natural conversations, building a traversable knowledge graph that stays under the user’s control.
by smithery-ai
Provides read and search capabilities for Markdown notes in an Obsidian vault for Claude Desktop and other MCP clients.
by chatmcp
Summarize chat messages by querying a local chat database and returning concise overviews.
by dmayboroda
Provides on‑premises conversational retrieval‑augmented generation (RAG) with configurable Docker containers, supporting fully local execution, ChatGPT‑based custom GPTs, and Anthropic Claude integration.
by qdrant
Provides a Model Context Protocol server that stores and retrieves semantic memories using Qdrant vector search, acting as a semantic memory layer.
by GreatScottyMac
Provides a project‑specific memory bank that stores decisions, progress, architecture, and custom data, exposing a structured knowledge graph via MCP for AI assistants and IDE tools.
by andrea9293
Provides document management and AI-powered semantic search for storing, retrieving, and querying text, markdown, and PDF files locally without external databases.