by johnpapa
Provides an MCP server that fetches the official Peacock extension documentation and exposes a `fetch_peacock_docs` tool, allowing AI agents in VS Code to answer questions about accent colors and theming.
```json
{
  "mcp": {
    "servers": {
      "peacock-mcp": {
        "command": "npx",
        "args": ["-y", "@johnpapa/peacock-mcp"],
        "env": {}
      }
    }
  },
  "chat.mcp.discovery.enabled": true
}
```
To build locally, run `npm install && npm run build`, then start the server with MCP Inspector:

```shell
npx @modelcontextprotocol/inspector node build/index.js
```
`fetch_peacock_docs` accepts a prompt and returns a concise answer.

Q: Do I need an API key?
A: No external API keys are required; the server uses publicly available documentation.
Q: Which command should I use for installation?
A: Prefer the npx approach: npx -y @johnpapa/peacock-mcp.
Q: Can I run the server in a container?
A: Yes, use the Docker badge commands which run docker run -i --rm mcp/peacock-mcp.
Q: How do I expose the server to all repositories?
A: Add the server definition to the VS Code User Settings JSON under the mcp.servers block and enable chat.mcp.discovery.enabled.
Q: What if the response is filtered by the Responsible AI Service?
A: Re‑run the query or rephrase the prompt; the server itself does not impose content filters.
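Building on the container FAQ above, the Docker command can be wired into an MCP server definition. This is a sketch assuming the `mcp/peacock-mcp` image mentioned in the badge commands is available locally:

```json
{
  "servers": {
    "peacock-mcp": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "mcp/peacock-mcp"],
      "env": {}
    }
  }
}
```

The `-i` flag keeps stdin open so the host can speak the MCP stdio transport, and `--rm` cleans up the container when the session ends.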
Features • Tools • Setup • Configuring an MCP Host
MCP Server for the Peacock extension for VS Code, coloring your world, one Code editor at a time. The main goal of the project is to show how an MCP server can be used to interact with APIs.
Note: All data used by this MCP server is fetched from the official Peacock documentation.
`fetch_peacock_docs` 🔍🦸‍♂️

- `prompt` (query): The question about Peacock.

Install Peacock for VS Code HERE.
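Under the hood, an MCP host invokes this tool with a standard JSON‑RPC `tools/call` request. A sketch of the wire format (the prompt value is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "fetch_peacock_docs",
    "arguments": {
      "prompt": "How do I set my VS Code accent colors?"
    }
  }
}
```

The server responds with a result whose `content` array carries the answer text drawn from the Peacock documentation.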
Note: If you already have the MCP server enabled with Claude Desktop, add `"chat.mcp.discovery.enabled": true` to your VS Code settings and it will discover existing MCP server lists.
If you want to associate the MCP server with a specific repo, create a .vscode/mcp.json file with this content:
```jsonc
{
  "inputs": [],
  "servers": {
    "peacock-mcp": {
      "command": "npx",
      // "command": "node",
      "args": [
        "-y",
        "@johnpapa/peacock-mcp"
        // "_git/peacock-mcp/dist/index.js"
      ],
      "env": {}
    }
  }
}
```
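The commented-out lines above hint at running a local build instead of the published package. A sketch of that variant, assuming a hypothetical path to your local clone's build output:

```jsonc
{
  "inputs": [],
  "servers": {
    "peacock-mcp": {
      // Run the locally built server directly with node
      "command": "node",
      "args": ["/path/to/peacock-mcp/dist/index.js"],
      "env": {}
    }
  }
}
```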
If you want to associate the MCP server with all repos, add the following to your VS Code User Settings JSON:
"mcp": {
"servers": {
"peacock-mcp": {
"command": "npx",
// "command": "node",
"args": [
"-y",
"@johnpapa/peacock-mcp"
// "/Users/papa/_git/peacock-mcp/dist/index.js"
// "_git/peacock-mcp/dist/index.js"
],
"env": {}
}
}
}
"chat.mcp.discovery.enabled": true,
Note: For quick installation, click the install buttons at the top of this README.
To manually install the Peacock MCP server in VS Code, follow these steps:
Open the Command Palette with Cmd+Shift+P (macOS) or Ctrl+Shift+P (Windows/Linux), search for "Preferences: Open User Settings (JSON)", and add:

```json
{
  "mcp": {
    "servers": {
      "peacock-mcp": {
        "command": "npx",
        "args": ["-y", "@johnpapa/peacock-mcp"],
        "env": {}
      }
    }
  },
  "chat.mcp.discovery.enabled": true
}
```
For VS Code Stable:

```shell
code --add-mcp '{"name":"peacock-mcp","command":"npx","args":["-y","@johnpapa/peacock-mcp"],"env":{}}'
```

For VS Code Insiders:

```shell
code-insiders --add-mcp '{"name":"peacock-mcp","command":"npx","args":["-y","@johnpapa/peacock-mcp"],"env":{}}'
```
To install the Peacock MCP server for Claude Desktop automatically via Smithery:

```shell
npx -y @smithery/cli install @johnpapa/peacock-mcp --client claude
```
If you'd like to run MCP Inspector locally to test the server, follow these steps:
Clone this repository:

```shell
git clone https://github.com/johnpapa/peacock-mcp
```

Install the required dependencies and build the project:

```shell
npm install
npm run build
```

(Optional) To try out the server using MCP Inspector, run:

```shell
# Start the MCP Inspector
npx @modelcontextprotocol/inspector node build/index.js
```
Visit the MCP Inspector URL shown in the console in your browser. Change Arguments to dist/index.js and select Connect. Select List Tools to see the available tools.
Now that the MCP server is discoverable, open GitHub Copilot and select Agent mode (not Chat or Edits).
Select the "refresh" button in the Copilot chat text field to refresh the server list.
Select the "🛠️" button to see all the possible tools, including the ones from this repo.
Put a question in the chat that would naturally invoke one of the tools, for example:
How do I set my VS Code accent colors?
Note: If you see "Sorry, the response was filtered by the Responsible AI Service. Please rephrase your prompt and try again.", try running it again or rephrasing the prompt.
```json
{
  "mcpServers": {
    "peacock-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@johnpapa/peacock-mcp"
      ],
      "env": {}
    }
  }
}
```

Or add it with the Claude CLI:

```shell
claude mcp add peacock-mcp npx -y @johnpapa/peacock-mcp
```

Explore related MCPs that share similar capabilities and solve comparable challenges:
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.