by da1z
Provides documentation access to LLMs via a Model Context Protocol server, fetching and parsing docs from URLs or local files.
DocsMCP enables Large Language Models to retrieve and query documentation from configured sources, supporting both remote URLs and local file paths.
DocsMCP can be configured through a .cursor/mcp.json file with a server entry that runs npx docsmcp and specifies a --source argument, or through a .vscode/mcp.json configuration defining a stdio server that runs npx docsmcp with the appropriate --source flag. It can also be launched directly from the command line:

npx docsmcp --source=Name|URL_or_path

The server runs via npx, so no global installation is required.

Q: Do I need to install anything globally?
A: No. The server runs with npx, which downloads and executes the package on demand.
Q: How are spaces handled in source definitions?
A: Wrap the entire --source string in single quotes, e.g., '--source=Model Context Protocol (MCP)|https://example.com/doc.txt'.
Q: Can I add multiple sources?
A: Yes. Include additional --source arguments when launching the server.
Q: What file formats are supported? A: Any text‑based format that can be fetched via HTTP(S) or read from the filesystem.
Q: Is there a license? A: The project is released under the MIT license.
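The Name|URL_or_path source format and the handling of multiple --source arguments described above can be sketched as follows. This is an illustrative Python sketch, not DocsMCP's actual implementation (DocsMCP itself is a Node.js package):

```python
# Illustrative sketch (not DocsMCP's actual code): collect and parse
# multiple --source=Name|URL_or_path arguments into a name -> location map.
def parse_sources(argv: list[str]) -> dict[str, str]:
    sources = {}
    for arg in argv:
        if arg.startswith("--source="):
            # Split the value at the first "|" into a display name
            # and a URL or local file path.
            name, _, location = arg[len("--source="):].partition("|")
            sources[name] = location
    return sources

args = [
    "--source=Model Context Protocol (MCP)|https://modelcontextprotocol.io/llms-full.txt",
    "--source=Local API docs|./docs/api.md",
]
print(parse_sources(args))
```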
A Model Context Protocol (MCP) server that provides documentation access to LLMs.
DocsMCP enables Large Language Models (LLMs) to access and query documentation from specified sources, whether from local files or remote URLs. It uses the Model Context Protocol (MCP) to facilitate communication between the LLM and documentation sources.
You can also configure DocsMCP in your Cursor project by creating a .cursor/mcp.json
file:
{
"mcpServers": {
"docs-mcp": {
"command": "npx",
"args": [
"-y",
"docsmcp",
"--source=Model Context Protocol (MCP)|https://modelcontextprotocol.io/llms-full.txt"
]
}
}
}
This configuration allows Cursor AI to use the documentation MCP server automatically when you open your project.
When specifying a source that contains spaces on the command line, wrap the entire string in quotes, for example: '--source=Model Context Protocol (MCP)|https://modelcontextprotocol.io/llms-full.txt'. In a JSON args array, each element is already passed as a single argument, so no extra shell quoting is needed.
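Why the quotes matter: the shell splits unquoted words at spaces, so an unquoted source definition arrives as several separate arguments. A small demonstration (count_args is a throwaway helper for this example, not part of DocsMCP):

```shell
# count_args is a throwaway helper that prints how many arguments it received.
count_args() { echo $#; }

count_args --source=Model Context Protocol    # unquoted: splits into 3 arguments
count_args '--source=Model Context Protocol'  # quoted: passed as 1 argument
```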
You can configure DocsMCP in VS Code by adding a configuration to your .vscode/mcp.json
file:
{
"servers": {
"documentation-mcp-server": {
"type": "stdio",
"command": "npx",
"args": [
"-y",
"docsmcp",
"--source=Model Context Protocol (MCP)|https://modelcontextprotocol.io/llms-full.txt"
]
}
}
}
This configuration allows VS Code extensions that support MCP to use the documentation server automatically.
The MCP server provides two main tools:
Lists all available documentation sources that have been configured.
Fetches and parses documentation from a given URL or local file path.
Parameters:
url: The URL or file path to fetch the documentation from.
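The fetch tool's behavior can be sketched roughly as follows. This is an illustrative Python sketch of the assumed URL-versus-file-path dispatch, not DocsMCP's actual Node.js implementation:

```python
# Illustrative sketch (not DocsMCP's actual code): return documentation
# text from either an HTTP(S) URL or a local file path.
from pathlib import Path
from urllib.request import urlopen

def fetch_documentation(url: str) -> str:
    if url.startswith(("http://", "https://")):
        with urlopen(url) as resp:                # remote source over HTTP(S)
            return resp.read().decode("utf-8")
    return Path(url).read_text(encoding="utf-8")  # local file path
```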