by contextstream
Provides persistent memory, semantic code search, and graph‑based analysis for AI tools through an MCP server.
ContextStream MCP Server enables AI assistants to retain decisions, preferences, and lessons across sessions, and to query codebases with semantic search and graph analysis without requiring explicit tool names.
At a glance:
- Install: npx -y @contextstream/mcp-server (or install globally with npm install -g @contextstream/mcp-server).
- Required environment variables: CONTEXTSTREAM_API_URL and either CONTEXTSTREAM_API_KEY or CONTEXTSTREAM_JWT.
- Run: npx -y @contextstream/mcp-server (or contextstream-mcp if installed globally).
- Key tools: session_init, context_smart, session_capture, projects_ingest_local, graph_ingest, etc., to load context, search code, capture memories, and analyze dependencies.
- Core capabilities include session context loading (session_init, context_smart), graph-based analysis (graph_ingest), and local project indexing (projects_ingest_local).

Q: Do I need to write special commands for the AI tool?
A: No. You simply describe the desired action (e.g., "show me the payment code") and the AI selects the appropriate ContextStream tool.
Q: Which environment variables are mandatory?
A: CONTEXTSTREAM_API_URL and either CONTEXTSTREAM_API_KEY or CONTEXTSTREAM_JWT.
Q: How do I limit the number of exposed tools?
A: Set CONTEXTSTREAM_TOOLSET=light in the server env to expose a smaller bundle, or use CONTEXTSTREAM_TOOL_ALLOWLIST to specify exact tools.
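For example, an allowlist in the env block of an mcpServers-style config might look like this (the chosen tool names are only an illustration; any tools from the catalog work):

"env": {
  "CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
  "CONTEXTSTREAM_API_KEY": "your_api_key",
  "CONTEXTSTREAM_TOOL_ALLOWLIST": "session_init,context_smart,session_capture,projects_ingest_local"
}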
Q: Can I write ingested files to disk for testing?
A: Yes, by setting the server‑side QA_FILE_WRITE_ROOT and passing write_to_disk:true to projects_ingest_local.
Q: How are PRO tools handled for free users?
A: Calls to PRO tools return an upgrade message with a configurable link (CONTEXTSTREAM_UPGRADE_URL).
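Both settings can be overridden in the server env if needed; the values below are just an illustration of the documented variables (defaults are listed in the configuration table further down):

"env": {
  "CONTEXTSTREAM_PRO_TOOLS": "ai_plan,ai_tasks",
  "CONTEXTSTREAM_UPGRADE_URL": "https://contextstream.io/pricing"
}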
Persistent memory, semantic search, and code intelligence for any MCP-compatible AI tool.
ContextStream is a shared "brain" for your AI workflows. It stores decisions, preferences, and lessons, and lets your AI tools search and analyze your codebase with consistent context across sessions.
You don't need to memorize tool names. Just describe what you want and your AI uses the right ContextStream tools automatically:
| You say... | ContextStream does... |
|---|---|
| "session summary" | Gets a summary of your workspace context |
| "what did we decide about auth?" | Recalls past decisions about authentication |
| "remember we're using PostgreSQL" | Saves this to memory for future sessions |
| "search for payment code" | Searches your codebase semantically |
| "what depends on UserService?" | Analyzes code dependencies |
No special syntax. No commands to learn. Just ask.
Tip: For best results, add the recommended editor rules so your AI consistently calls session_init / context_smart automatically.

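As a rough sketch of what such a rule can say (the exact file depends on your editor, and the wording below is only an example, not the official rule text):

At the start of every session, call session_init with the current folder path and the first user message as context_hint.
For each subsequent user message, call context_smart with that message.
When the user states a decision, preference, or lesson, save it with session_capture or session_capture_lesson.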
Core capabilities: session memory and smart context (session_init, context_smart), graph-based code analysis (graph_ingest), and local project indexing (projects_ingest_local).

Quick setup (recommended): this interactive wizard sets up authentication, installs editor rules, and writes MCP config files for the tools you select.
npx -y @contextstream/mcp-server setup
Notes:
- Toolsets: light (~30 tools), standard (default, ~50 tools), or complete (~86 tools including workspaces, projects, search, memory, graph, AI, and integrations).
- Credentials are saved to ~/.contextstream/credentials.json (and also written into the MCP config files the wizard generates). Delete that file to force a fresh login.
- Codex only has a global config (~/.codex/config.toml), so the wizard will always write Codex config globally when selected.
- To preview what the wizard would change without writing anything, run npx -y @contextstream/mcp-server setup --dry-run.

Run directly (recommended for MCP configs):
npx -y @contextstream/mcp-server
Or install globally:
npm install -g @contextstream/mcp-server
contextstream-mcp
To get the latest features and bug fixes, update periodically:
npm update -g @contextstream/mcp-server
The MCP server will warn you when a newer version is available. After updating, restart your AI tool to use the new version.
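If your MCP config launches the server via npx, you can also pin the latest release explicitly so a stale npx cache is not reused; this is plain npm version-spec syntax, not a ContextStream-specific flag:

"args": ["-y", "@contextstream/mcp-server@latest"]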
If you ran the setup wizard, you can usually skip this section.
If you prefer to configure things by hand (or your tool can't be auto-configured), add the ContextStream MCP server to your client using one of the examples below.
Toolset: By default, the server exposes the standard toolset (~50 tools). Use CONTEXTSTREAM_TOOLSET=light to reduce tool count (~30 tools), or CONTEXTSTREAM_TOOLSET=complete to expose all ~86 tools (workspaces, projects, search, memory, graph, AI, integrations). See the full tool catalog.
These clients use the mcpServers JSON schema:
- Cursor: ~/.cursor/mcp.json (global) or .cursor/mcp.json (project)
- Windsurf: ~/.codeium/windsurf/mcp_config.json
- Claude Desktop (macOS): ~/Library/Application Support/Claude/claude_desktop_config.json
- Claude Desktop (Windows): %APPDATA%\Claude\claude_desktop_config.json

Many other MCP JSON clients also use this same mcpServers shape (including Claude Code project scope via .mcp.json).
Standard toolset (default, ~50 tools):
{
"mcpServers": {
"contextstream": {
"command": "npx",
"args": ["-y", "@contextstream/mcp-server"],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_api_key"
}
}
}
}
Complete toolset (~86 tools):
{
"mcpServers": {
"contextstream": {
"command": "npx",
"args": ["-y", "@contextstream/mcp-server"],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_api_key",
"CONTEXTSTREAM_TOOLSET": "complete"
}
}
}
}
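Light toolset (~30 tools), if you prefer a smaller bundle; the shape is identical and only the CONTEXTSTREAM_TOOLSET value changes:

{
  "mcpServers": {
    "contextstream": {
      "command": "npx",
      "args": ["-y", "@contextstream/mcp-server"],
      "env": {
        "CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
        "CONTEXTSTREAM_API_KEY": "your_api_key",
        "CONTEXTSTREAM_TOOLSET": "light"
      }
    }
  }
}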
VS Code (.vscode/mcp.json): VS Code uses a different schema with a top-level servers map:
Standard toolset (default):
{
"servers": {
"contextstream": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@contextstream/mcp-server"],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_api_key"
}
}
}
}
Complete toolset (~86 tools):
{
"servers": {
"contextstream": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@contextstream/mcp-server"],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "your_api_key",
"CONTEXTSTREAM_TOOLSET": "complete"
}
}
}
}
Strongly recommended: VS Code supports inputs, so you don't have to hardcode secrets in a committed file:
{
"servers": {
"contextstream": {
"type": "stdio",
"command": "npx",
"args": ["-y", "@contextstream/mcp-server"],
"env": {
"CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
"CONTEXTSTREAM_API_KEY": "${input:contextstreamApiKey}"
}
}
},
"inputs": [
{
"id": "contextstreamApiKey",
"type": "promptString",
"description": "ContextStream API Key",
"password": true
}
]
}
Claude Code (CLI), user scope (all projects):
Standard toolset (default):
claude mcp add --transport stdio contextstream --scope user \
--env CONTEXTSTREAM_API_URL=https://api.contextstream.io \
--env CONTEXTSTREAM_API_KEY=YOUR_KEY \
-- npx -y @contextstream/mcp-server
Complete toolset (~86 tools):
claude mcp add --transport stdio contextstream --scope user \
--env CONTEXTSTREAM_API_URL=https://api.contextstream.io \
--env CONTEXTSTREAM_API_KEY=YOUR_KEY \
--env CONTEXTSTREAM_TOOLSET=complete \
-- npx -y @contextstream/mcp-server
Note: Claude Code may warn about large tool contexts when using complete. The default is standard (~50 tools). Use light for fewer tools.
Windows caveat (native Windows, not WSL): if npx isn't found, use cmd /c npx -y @contextstream/mcp-server after --.
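Putting that together, the user-scope command above would look roughly like this on native Windows (written on a single line, since the backslash line continuations are shell-specific):

claude mcp add --transport stdio contextstream --scope user --env CONTEXTSTREAM_API_URL=https://api.contextstream.io --env CONTEXTSTREAM_API_KEY=YOUR_KEY -- cmd /c npx -y @contextstream/mcp-server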
Alternative (JSON form):
Standard:
claude mcp add-json contextstream \
'{"type":"stdio","command":"npx","args":["-y","@contextstream/mcp-server"],"env":{"CONTEXTSTREAM_API_URL":"https://api.contextstream.io","CONTEXTSTREAM_API_KEY":"your_api_key"}}'
Complete:
claude mcp add-json contextstream \
'{"type":"stdio","command":"npx","args":["-y","@contextstream/mcp-server"],"env":{"CONTEXTSTREAM_API_URL":"https://api.contextstream.io","CONTEXTSTREAM_API_KEY":"your_api_key","CONTEXTSTREAM_TOOLSET":"complete"}}'
Codex (~/.codex/config.toml)

Standard toolset (default):
[mcp_servers.contextstream]
command = "npx"
args = ["-y", "@contextstream/mcp-server"]
[mcp_servers.contextstream.env]
CONTEXTSTREAM_API_URL = "https://api.contextstream.io"
CONTEXTSTREAM_API_KEY = "your_api_key"
Complete toolset (~86 tools):
[mcp_servers.contextstream]
command = "npx"
args = ["-y", "@contextstream/mcp-server"]
[mcp_servers.contextstream.env]
CONTEXTSTREAM_API_URL = "https://api.contextstream.io"
CONTEXTSTREAM_API_KEY = "your_api_key"
CONTEXTSTREAM_TOOLSET = "complete"
After editing, restart your MCP client so it reloads the server configuration.
You can authenticate using either:
- CONTEXTSTREAM_API_KEY (recommended for local/dev)
- CONTEXTSTREAM_JWT (useful for hosted or user-session flows)

| Variable | Required | Description |
|---|---|---|
| CONTEXTSTREAM_API_URL | Yes | Base API URL (e.g. https://api.contextstream.io) |
| CONTEXTSTREAM_API_KEY | Yes* | API key (*required unless CONTEXTSTREAM_JWT is set) |
| CONTEXTSTREAM_JWT | Yes* | JWT (*required unless CONTEXTSTREAM_API_KEY is set) |
| CONTEXTSTREAM_WORKSPACE_ID | No | Default workspace ID fallback |
| CONTEXTSTREAM_PROJECT_ID | No | Default project ID fallback |
| CONTEXTSTREAM_USER_AGENT | No | Custom user agent string |
| CONTEXTSTREAM_TOOLSET | No | Tool bundle to expose: light (~30 tools), standard (default, ~50 tools), or complete (~86 tools). Claude Code/Desktop may warn about large tool contexts with complete. |
| CONTEXTSTREAM_TOOL_ALLOWLIST | No | Comma-separated tool names to expose (overrides toolset) |
| CONTEXTSTREAM_PRO_TOOLS | No | Comma-separated tool names treated as PRO (default: ai_context,ai_enhanced_context,ai_context_budget,ai_embeddings,ai_plan,ai_tasks) |
| CONTEXTSTREAM_UPGRADE_URL | No | Upgrade link shown when Free users call PRO tools (default: https://contextstream.io/pricing) |
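For the JWT flow, the config is identical except that CONTEXTSTREAM_JWT replaces the API key; the workspace default is an optional fallback and the values below are placeholders:

{
  "mcpServers": {
    "contextstream": {
      "command": "npx",
      "args": ["-y", "@contextstream/mcp-server"],
      "env": {
        "CONTEXTSTREAM_API_URL": "https://api.contextstream.io",
        "CONTEXTSTREAM_JWT": "your_jwt",
        "CONTEXTSTREAM_WORKSPACE_ID": "your_workspace_id"
      }
    }
  }
}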
The following environment variables are configured on the ContextStream API server (not in your MCP client config):
| Variable | Required | Description |
|---|---|---|
| QA_FILE_WRITE_ROOT | No | Server-side root directory for write_to_disk file writes. When set, the API allows the projects_ingest_local tool to write ingested files to disk for testing/QA purposes. Files are written under <QA_FILE_WRITE_ROOT>/<project_id>/<relative_path>. If not set, write_to_disk requests are rejected. |
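On the API server this is a plain environment variable; the path below is only a hypothetical example:

QA_FILE_WRITE_ROOT=/var/lib/contextstream/qa-writes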
The projects_ingest_local tool accepts two optional parameters for QA/testing scenarios:
| Parameter | Type | Default | Description |
|---|---|---|---|
| write_to_disk | boolean | false | When true, writes ingested files to disk on the API server under QA_FILE_WRITE_ROOT before indexing. Requires the API to have QA_FILE_WRITE_ROOT configured. |
| overwrite | boolean | false | When true (and write_to_disk is enabled), allows overwriting existing files. Otherwise, existing files are skipped. |
Example usage:
{
"path": "/path/to/local/project",
"write_to_disk": true,
"overwrite": false
}
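For instance, assuming QA_FILE_WRITE_ROOT=/var/lib/contextstream/qa-writes on the API server and a project ID of abc123 (both hypothetical), an ingested file src/index.ts would be written to /var/lib/contextstream/qa-writes/abc123/src/index.ts before indexing.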
Note: The write_to_disk feature is intended for testing, QA, and development scenarios where you need to materialize files on a test server. In production, QA_FILE_WRITE_ROOT should typically be unset to disable file writes.
Recommended usage pattern:
- At the start of a session: session_init(folder_path="...", context_hint="<first user message>")
- For each user message: context_smart(user_message="<current user message>")
- To save decisions, preferences, or lessons: session_capture(...) or session_capture_lesson(...)

Most tools accept omitted workspace_id / project_id and will use the current session defaults.
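For illustration, a session in a local repo might issue calls like these (the folder path and messages are placeholders):

session_init(folder_path="/home/me/projects/acme-api", context_hint="help me refactor the payment flow")
context_smart(user_message="what did we decide about auth?")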
- If a tool reports a missing workspace_id or project_id, call session_init first (or pass the ID explicitly).
- For a new repository folder, call workspace_associate once so the server can auto-select the right workspace for that folder.
- If your account has no workspaces, ContextStream will prompt your AI assistant to ask you for a workspace name.
The assistant can then create one with workspace_bootstrap(workspace_name="...", folder_path="...").

Tools are labeled as (Free) or (PRO) in the MCP tool list.
- Default PRO tools: ai_context, ai_enhanced_context, ai_context_budget, ai_embeddings, ai_plan, ai_tasks
- The PRO tool list can be changed via CONTEXTSTREAM_PRO_TOOLS and the upgrade link via CONTEXTSTREAM_UPGRADE_URL.

Troubleshooting:
- If the server cannot authenticate, check CONTEXTSTREAM_API_URL and CONTEXTSTREAM_API_KEY (or CONTEXTSTREAM_JWT).
- If context comes from the wrong workspace, run workspace_associate to map the current repo folder to the correct workspace.

To work on the server locally:
git clone https://github.com/contextstream/mcp-server.git
cd mcp-server
npm install
npm run dev
npm run typecheck
npm run build
License: MIT