by StitchAI
Provides tools for creating, retrieving, and managing AI agent memories through a Model Context Protocol server.
A lightweight Node.js server that implements the Model Context Protocol to manage AI agent memories. It lets developers create dedicated memory spaces, upload and retrieve individual memories, and query collections of memories with filtering and pagination.
git clone https://github.com/StitchAI/stitch-ai-mcp.git
npm install @modelcontextprotocol/sdk zod
npm install -D @types/node typescript
npm run start
Configure Claude for Desktop with an entry in claude_desktop_config.json that points to the server script using npx ts-node (see the serverConfig section below).
Space management tools: create_space, delete_space, get_all_spaces.
Memory tools: upload_memory, get_memory, get_all_memories with optional name filtering, limit, and offset.
Start the server with npm run start.
Q: Do I need an API key?
A: Yes, the server expects an API_KEY environment variable for authentication with Stitch AI's backend services.
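A minimal sketch of that startup expectation, assuming the server reads both values from process.env; the actual src/server.ts is not shown here, and the fallback URL simply mirrors the demo endpoint from the Claude Desktop config below:

// Illustrative startup check, not taken verbatim from src/server.ts.
const API_KEY = process.env.API_KEY;
// Fallback mirrors the demo endpoint used in the Claude Desktop config below.
const BASE_URL = process.env.BASE_URL ?? "https://api-demo.stitch-ai.co";

if (!API_KEY) {
  throw new Error("API_KEY environment variable is required to authenticate with the Stitch AI backend");
}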
Q: Can I run the server on a different port? A: The repository uses the default port defined in the source code; you can change it by modifying the server configuration or environment variables.
Q: Is there a Docker image? A: The README does not provide a Docker setup, but the Node.js app can be containerized using a standard Node base image.
Q: How do I filter memories by name?
A: Use the optional memory_names query parameter (comma-separated) in the get_all_memories tool.
Q: What is the default pagination?
A: limit defaults to 50 and offset defaults to 0 if not supplied.
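As a concrete illustration of those parameters, here is a hedged sketch of calling get_all_memories from an MCP client built with the same TypeScript SDK. The client name, space name, and filter values are placeholders; the command, args, and env mirror the Claude Desktop configuration shown below.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, the same way Claude for Desktop does.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["ts-node", "/path/to/cloned/stitch-ai-mcp/src/server.ts"],
  env: { API_KEY: "<STITCH_AI_API_KEY>", BASE_URL: "https://api-demo.stitch-ai.co" },
});

const client = new Client({ name: "stitchai-example-client", version: "1.0.0" });
await client.connect(transport);

// Placeholder space and filter names; limit/offset default to 50/0 when omitted.
const result = await client.callTool({
  name: "get_all_memories",
  arguments: {
    space: "my-agent-space",
    memory_names: "project-plan,meeting-notes",
    limit: 20,
    offset: 0,
  },
});
console.log(result.content);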
Decentralized Knowledge Hub for AI
This repository contains a Model Context Protocol (MCP) server implementation for Stitch AI's memory management system. The server provides tools for creating, retrieving, and managing AI agent memories.
The MCP server provides the following tools:
create_space
Creates a new memory space with the specified name.
space_name: The name of the memory space to create
type: The type of memory space to create
delete_space
Deletes a memory space with the specified name.
space_name: The name of the memory space to delete
get_all_spaces
Gets a list of all available memory spaces.
upload_memory
Uploads a new memory to a specified memory space.
space: The name of the memory space to upload to
message: The memory message to upload
memory: The memory content to upload
get_memory
Retrieves a specific memory by ID from a memory space.
space: The name of the memory space
memory_id: The ID of the memory to retrieve
get_all_memories
Retrieves all memories from a specified memory space.
space: The name of the memory space to retrieve memories from
memory_names: Comma-separated list of memory names to filter
limit: Maximum number of memories to return (default: 50)
offset: Number of memories to skip (default: 0)
Run the server: npm run start
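For orientation, the following is a hedged sketch of how a tool like get_all_memories could be registered with the MCP TypeScript SDK and zod. It mirrors the parameters and defaults documented above, but the server name/version, the /memories endpoint path, and the x-api-key header are assumptions for illustration, not details taken from the actual src/server.ts.

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "stitchai", version: "1.0.0" });

// Illustrative registration of get_all_memories; the real handler lives in src/server.ts.
server.tool(
  "get_all_memories",
  "Retrieves all memories from a specified memory space.",
  {
    space: z.string().describe("Memory space to retrieve memories from"),
    memory_names: z.string().optional().describe("Comma-separated list of memory names to filter"),
    limit: z.number().int().default(50).describe("Maximum number of memories to return"),
    offset: z.number().int().default(0).describe("Number of memories to skip"),
  },
  async ({ space, memory_names, limit, offset }) => {
    // The endpoint path and auth header below are assumptions for illustration only.
    const url = new URL(`${process.env.BASE_URL}/memories`);
    url.searchParams.set("space", space);
    if (memory_names) url.searchParams.set("memory_names", memory_names);
    url.searchParams.set("limit", String(limit));
    url.searchParams.set("offset", String(offset));

    const res = await fetch(url, { headers: { "x-api-key": process.env.API_KEY ?? "" } });
    return { content: [{ type: "text", text: await res.text() }] };
  }
);

// Expose the tools over stdio so MCP clients such as Claude for Desktop can connect.
await server.connect(new StdioServerTransport());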
Clone the repository
git clone https://github.com/StitchAI/stitch-ai-mcp.git
Install dependencies
npm install @modelcontextprotocol/sdk zod
npm install -D @types/node typescript
Install Claude for Desktop
Configure Claude for Desktop
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %AppData%\Claude\claude_desktop_config.json
Edit Configuration File
macOS: code ~/Library/Application\ Support/Claude/claude_desktop_config.json
Windows (PowerShell): code $env:AppData\Claude\claude_desktop_config.json
{
"mcpServers": {
"stitchai": {
"command": "npx",
"args": [
"ts-node",
"/path/to/cloned/stitch-ai-mcp/src/server.ts"
],
"env": {
"API_KEY": "<STITCH_AI_API_KEY>",
"BASE_URL": "https://api-demo.stitch-ai.co"
}
}
}
}
{ "mcpServers": { "stitchai": { "command": "npx", "args": [ "ts-node", "/path/to/cloned/stitch-ai-mcp/src/server.ts" ], "env": { "API_KEY": "<STITCH_AI_API_KEY>", "BASE_URL": "https://api-demo.stitch-ai.co" } } } }
Explore related MCPs that share similar capabilities and solve comparable challenges.
by modelcontextprotocol
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
by topoteretes
Provides dynamic memory for AI agents through modular ECL (Extract, Cognify, Load) pipelines, enabling seamless integration with graph and vector stores using minimal code.
by basicmachines-co
Enables persistent, local‑first knowledge management by allowing LLMs to read and write Markdown files during natural conversations, building a traversable knowledge graph that stays under the user’s control.
by smithery-ai
Provides read and search capabilities for Markdown notes in an Obsidian vault for Claude Desktop and other MCP clients.
by chatmcp
Summarize chat messages by querying a local chat database and returning concise overviews.
by dmayboroda
Provides on‑premises conversational retrieval‑augmented generation (RAG) with configurable Docker containers, supporting fully local execution, ChatGPT‑based custom GPTs, and Anthropic Claude integration.
by GreatScottyMac
Provides a project‑specific memory bank that stores decisions, progress, architecture, and custom data, exposing a structured knowledge graph via MCP for AI assistants and IDE tools.
by andrea9293
Provides document management and AI-powered semantic search for storing, retrieving, and querying text, markdown, and PDF files locally without external databases.
by scorzeth
Provides a local MCP server that interfaces with a running Anki instance to retrieve, create, and update flashcards through standard MCP calls.