by hannesj
Exposes OpenAPI specification data to LLMs, enabling interactive exploration of endpoints, parameters, request/response schemas, components, and examples through a set of dedicated tools.
Provides a Model Context Protocol (MCP) server that loads an OpenAPI JSON or YAML file and offers LLM‑friendly endpoints for querying the API definition. The server returns information in YAML format, which is easier for language models to parse.
Command‑line
# default openapi.yaml in the current folder
npx -y mcp-openapi-schema
# specify a schema file (relative path)
npx -y mcp-openapi-schema ../petstore.json
# specify a schema file (absolute path)
npx -y mcp-openapi-schema /absolute/path/to/api-spec.yaml
# show help
npx -y mcp-openapi-schema --help
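The schema-path handling above (argument if given, otherwise `openapi.yaml` in the current folder) can be sketched in Python. This is a hypothetical helper illustrating the behavior, not the package's actual code:

```python
import os

def resolve_schema_path(argv):
    """Pick the schema file the way the CLI described above does:
    use the first argument if given, else fall back to ./openapi.yaml."""
    # First CLI argument, if any, names the schema file (relative or absolute).
    candidate = argv[1] if len(argv) > 1 else "openapi.yaml"
    # Relative paths are resolved against the current working directory.
    return os.path.abspath(candidate)
```

For example, `resolve_schema_path(["prog"])` yields an absolute path ending in `openapi.yaml`, while an absolute argument is returned unchanged.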
Claude Desktop integration – add an entry to claude_desktop_config.json with the command and arguments shown above.
Claude Code integration – register the server via the claude mcp add command, then query the API through natural‑language prompts.
list-endpoints – returns all paths and HTTP methods with summaries
get-endpoint – detailed view of a specific operation, including parameters and responses
get-request-body / get-response-schema – fetches request and response models
list-components / get-component – inspects reusable schemas, parameters, responses, etc.
list-security-schemes – enumerates authentication mechanisms
get-examples – provides example payloads for endpoints or components
search-schema – full‑text search across the specification
Q: Do I need to install anything globally?
A: No. The server runs via npx, which fetches the package on demand.
Q: Which OpenAPI versions are supported? A: Any version that can be parsed by the underlying library (typically 3.0.x and 3.1.x).
Q: Can I run the server with a custom port? A: The package uses the default MCP port conventions; custom ports can be set via environment variables if needed (not documented in the README).
Q: How does the server return data? A: Responses are serialized in YAML to make them easier for LLMs to read.
Q: Is the server compatible with Claude Desktop and Claude Code? A: Yes – the README includes configuration snippets for both integrations.
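The version question above can be illustrated with a small check. This is a sketch that assumes the specification is already parsed into a dict; the real package delegates version handling to its underlying OpenAPI parsing library:

```python
def is_supported_openapi_version(spec):
    """Return True for OpenAPI 3.0.x / 3.1.x documents, per the FAQ above."""
    version = str(spec.get("openapi", ""))
    return version.startswith(("3.0", "3.1"))
```

A Swagger 2.0 document, which uses a `swagger` key instead of `openapi`, would fail this check.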
To use this MCP server with Claude Desktop, edit your claude_desktop_config.json
configuration file:
{
  "mcpServers": {
    "OpenAPI Schema": {
      "command": "npx",
      "args": ["-y", "mcp-openapi-schema", "/ABSOLUTE/PATH/TO/openapi.yaml"]
    }
  }
}
Location of the configuration file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: $env:AppData\Claude\claude_desktop_config.json
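The configuration fragment above can also be generated programmatically. The following is a minimal sketch: the server name "OpenAPI Schema" and the npx invocation mirror the README, while the helper function itself is hypothetical:

```python
import json

def desktop_config_entry(schema_path):
    """Build the claude_desktop_config.json fragment shown above,
    pointing the server at the given absolute schema path."""
    return {
        "mcpServers": {
            "OpenAPI Schema": {
                "command": "npx",
                "args": ["-y", "mcp-openapi-schema", schema_path],
            }
        }
    }

# Serialize with indentation for pasting into the config file.
print(json.dumps(desktop_config_entry("/ABSOLUTE/PATH/TO/openapi.yaml"), indent=2))
```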
To use this MCP server with Claude Code CLI, follow these steps:
Add the OpenAPI Schema MCP server to Claude Code
# Basic syntax
claude mcp add openapi-schema npx -y mcp-openapi-schema
# Example with specific schema
claude mcp add petstore-api npx -y mcp-openapi-schema ~/Projects/petstore.yaml
Verify the MCP server is registered
# List all configured servers
claude mcp list
# Get details for your OpenAPI schema server
claude mcp get openapi-schema
Remove the server if needed
claude mcp remove openapi-schema
Use the tool in Claude Code
Once configured, you can invoke the tool in your Claude Code session by asking questions about the OpenAPI schema.
Tips:
Use the -s or --scope flag with project (default) or global to specify where the configuration is stored.
The server provides the following tools for LLMs to interact with OpenAPI schemas:
list-endpoints: Lists all API paths and their HTTP methods with summaries in a nested object structure
get-endpoint: Gets detailed information about a specific endpoint including parameters and responses
get-request-body: Gets the request body schema for a specific endpoint and method
get-response-schema: Gets the response schema for a specific endpoint, method, and status code
get-path-parameters: Gets the parameters for a specific path
list-components: Lists all schema components (schemas, responses, parameters, etc.)
get-component: Gets detailed definition for a specific component
list-security-schemes: Lists all available security schemes
get-examples: Gets examples for a specific component or endpoint
search-schema: Searches across paths, operations, and schemas
Example queries to try:
What endpoints are available in this API?
Show me the details for the POST /pets endpoint.
What parameters does the GET /pets/{petId} endpoint take?
What is the request body schema for creating a new pet?
What response will I get from the DELETE /pets/{petId} endpoint?
What schemas are defined in this API?
Show me the definition of the Pet schema.
What are the available security schemes for this API?
Are there any example responses for getting a pet by ID?
Search for anything related to "user" in this API.
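The nested paths-to-methods structure that list-endpoints returns can be sketched over a parsed spec dict. This is a hypothetical illustration of the described output shape; the real tool additionally serializes its result to YAML:

```python
def list_endpoints(spec):
    """Group HTTP methods and their summaries under each path,
    mirroring the nested structure the list-endpoints tool describes."""
    http_methods = {"get", "put", "post", "delete", "options", "head", "patch", "trace"}
    result = {}
    for path, item in spec.get("paths", {}).items():
        # Keep only keys that are HTTP operations (skip parameters, servers, etc.).
        ops = {m: op.get("summary", "") for m, op in item.items() if m in http_methods}
        if ops:
            result[path] = ops
    return result

sample_spec = {
    "paths": {
        "/pets": {
            "get": {"summary": "List all pets"},
            "post": {"summary": "Create a pet"},
        },
        "/pets/{petId}": {"get": {"summary": "Get a pet by ID"}},
    }
}
# list_endpoints(sample_spec)["/pets"] -> {"get": "List all pets", "post": "Create a pet"}
```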