by modelcontextprotocol
Model Context Protocol Servers
The servers project provides an MCP server that exercises all features of the MCP protocol. It is designed as a test server for developers building MCP clients, rather than a production-ready server. It implements various functionalities like prompts, tools, resources, and sampling to demonstrate MCP capabilities.
You can use the servers project in several ways:
1. With Claude Desktop:
Add the following configuration to your claude_desktop_config.json:
{
  "mcpServers": {
    "everything": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-everything"
      ]
    }
  }
}
2. With VS Code:
You can use the one-click install buttons provided in the README for VS Code or VS Code Insiders, which install the server via NPX or Docker. Alternatively, you can manually add the following JSON block to your VS Code User Settings (JSON) or to a .vscode/mcp.json file in your workspace.
{
  "mcp": {
    "servers": {
      "everything": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-everything"]
      }
    }
  }
}
3. Running from source:

To start with the SSE transport:

cd src/everything
npm install
npm run start:sse

To start with the streamable HTTP transport:

cd src/everything
npm install
npm run start:streamableHttp
4. Running as an installed package:
npm install -g @modelcontextprotocol/server-everything@latest
npx @modelcontextprotocol/server-everything
npx @modelcontextprotocol/server-everything stdio
npx @modelcontextprotocol/server-everything sse
npx @modelcontextprotocol/server-everything streamableHttp
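Whichever transport you choose, an MCP client opens the session the same way: it sends an initialize request, reads the server's capability response, then confirms with an initialized notification. A minimal sketch of those first two client messages follows; the protocolVersion string and client name are illustrative values, not requirements of this server.

```python
import json

# JSON-RPC 2.0 initialize request that opens an MCP session.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Notification (no "id" field) sent after the server's initialize response.
initialized = {"jsonrpc": "2.0", "method": "notifications/initialized"}

for message in (initialize, initialized):
    print(json.dumps(message))
```

Over the stdio transport these are written as newline-delimited JSON to the server process's stdin; over SSE and streamable HTTP the same payloads travel as HTTP bodies.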
The servers project includes the following key components:

Tools:
- echo: Echoes back input messages.
- add: Adds two numbers.
- longRunningOperation: Simulates a long operation with progress notifications.
- sampleLLM: Demonstrates LLM sampling.
- getTinyImage: Returns a small test image.
- printEnv: Prints environment variables.
- annotatedMessage: Demonstrates message annotations with priority and audience settings, optionally including an image.
- getResourceReference: Returns a resource reference for MCP clients.
- startElicitation: Initiates an elicitation interaction with user input.

Resources: test resources at test://static/resource/{id}.

Prompts:
- simple_prompt: Basic prompt without arguments.
- complex_prompt: Advanced prompt with arguments and multi-turn conversation support.
- resource_prompt: Demonstrates embedding resource references in prompts.

The project can be installed globally via npm (npm install -g @modelcontextprotocol/server-everything@latest) or run using npx. Various commands are available to start the server with different transport protocols (stdio, sse, streamableHttp).
This MCP server attempts to exercise all the features of the MCP protocol. It is not intended to be a useful server, but rather a test server for builders of MCP clients. It implements prompts, tools, resources, sampling, and more to showcase MCP capabilities.
echo
- message (string): Message to echo back

add
- a (number): First number
- b (number): Second number

longRunningOperation
- duration (number, default: 10): Duration in seconds
- steps (number, default: 5): Number of progress steps

sampleLLM
- prompt (string): The prompt to send to the LLM
- maxTokens (number, default: 100): Maximum tokens to generate

getTinyImage

printEnv

annotatedMessage
- messageType (enum: "error" | "success" | "debug"): Type of message to demonstrate different annotation patterns
- includeImage (boolean, default: false): Whether to include an example image

Annotations take the form:

{
  "priority": 1.0,
  "audience": ["user", "assistant"]
}

getResourceReference
- resourceId (number, 1-100): ID of the resource to reference

Returns content with type: "resource".

startElicitation
- color (string): Favorite color
- number (number, 1-100): Favorite number
- pets (enum): Favorite pet

The server provides 100 test resources in two formats:
- Even-numbered resources: test://static/resource/{even_number}
- Odd-numbered resources: test://static/resource/{odd_number}
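Because the IDs simply alternate between the two formats, a client can enumerate the full URI set directly. A small sketch, using the URI pattern shown above:

```python
# Enumerate the 100 test resource URIs, split by the even/odd convention above.
even = [f"test://static/resource/{i}" for i in range(2, 101, 2)]
odd = [f"test://static/resource/{i}" for i in range(1, 101, 2)]
print(len(even), len(odd))  # 50 50
```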
simple_prompt

complex_prompt
- temperature (number): Temperature setting
- style (string): Output style preference

resource_prompt
- resourceId (number): ID of the resource to embed (1-100)

The server sends random-leveled log messages every 15 seconds, e.g.:
{
  "method": "notifications/message",
  "params": {
    "level": "info",
    "data": "Info-level message"
  }
}
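A client that receives these messages needs to route them by level. A minimal dispatch sketch, assuming payloads shaped like the example above:

```python
import json

def handle_log_notification(raw):
    """Parse an MCP notifications/message payload and format it by level."""
    msg = json.loads(raw)
    if msg.get("method") != "notifications/message":
        return None  # not a log notification; let another handler take it
    params = msg["params"]
    return f"[{params['level'].upper()}] {params['data']}"

raw = '{"method": "notifications/message", "params": {"level": "info", "data": "Info-level message"}}'
print(handle_log_notification(raw))  # [INFO] Info-level message
```

Since the server emits a message every 15 seconds at a random level, this makes it a convenient fixture for testing a client's log filtering.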
For quick installation, use one of the one-click install buttons below.

For manual installation, add the following JSON block to your User Settings (JSON) file in VS Code. You can do this by pressing Ctrl + Shift + P and typing Preferences: Open User Settings (JSON).

Optionally, you can add it to a file called .vscode/mcp.json in your workspace. This will allow you to share the configuration with others. Note that the mcp key is not needed in the .vscode/mcp.json file.
{
  "mcp": {
    "servers": {
      "everything": {
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-everything"]
      }
    }
  }
}