by baryhuang
Enables Claude to discover and call any API endpoint through natural‑language queries by semantically searching OpenAPI specifications. It chunks large OpenAPI docs into per‑endpoint vectors, stores them in an in‑memory FAISS index, and returns full endpoint schemas for request generation.
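To make the chunking step concrete, here is a minimal sketch of turning an OpenAPI spec into one searchable text chunk per endpoint. The `chunk_openapi` helper and the tiny spec are hypothetical illustrations, not the server's actual code; the real server embeds each chunk with MiniLM and indexes the vectors in FAISS.

```python
# Sketch: flatten an OpenAPI spec into one text chunk per endpoint.
# Each chunk keeps the full operation object so the schema can be
# returned to the client after a semantic-search hit.
def chunk_openapi(spec: dict) -> list[dict]:
    chunks = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            text = f"{method.upper()} {path}: {op.get('summary', '')} {op.get('description', '')}".strip()
            chunks.append({"path": path, "method": method.upper(), "text": text, "schema": op})
    return chunks

# Hypothetical spec fragment for illustration only.
spec = {
    "paths": {
        "/stocks": {"get": {"summary": "List prices for all stocks"}},
        "/stocks/{symbol}": {"get": {"summary": "Get a single stock quote"}},
    }
}
chunks = chunk_openapi(spec)
```

The per-endpoint granularity is what keeps a few-hundred-KB spec usable: only the handful of chunks nearest the query are ever returned to the model.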
Point the server at your API with the OPENAPI_JSON_DOCS_URL environment variable. Use MCP_API_PREFIX to customize the tool namespace and GLOBAL_TOOL_PROMPT to prepend a custom description to tool metadata. The tools exposed are {prefix}_api_request_schema and {prefix}_make_request, where {prefix} comes from MCP_API_PREFIX.
Q: Why is the Docker image so large?
A: The image includes the MiniLM model and optional pre‑downloaded weights; building a slimmer image or mounting the model externally can reduce size.
Q: What is the cold‑start penalty?
A: Loading the embedding model takes about 15 seconds on first start; subsequent requests are instantaneous.
Q: Which platforms are supported?
A: Official Docker images for linux/amd64 and linux/arm64. linux/arm/v7 is not supported due to Transformers library constraints.
Q: How do I customize the tool descriptions?
A: Set GLOBAL_TOOL_PROMPT to any text; it is prepended to every tool's description to guide Claude's tool selection.
Q: Can I run multiple APIs on the same host?
A: Yes. Use the multi-instance JSON example and give each container a distinct MCP_API_PREFIX value.
Customize through environment variables. GLOBAL_TOOL_PROMPT is IMPORTANT!
OPENAPI_JSON_DOCS_URL: URL to the OpenAPI specification JSON (defaults to https://api.staging.readymojo.com/openapi.json)
MCP_API_PREFIX: Customizable tool namespace (default "any_openapi"):
# Creates tools: finance_api_request_schema and finance_make_request
docker run -e MCP_API_PREFIX=finance ...
GLOBAL_TOOL_PROMPT: Optional text prepended to all tool descriptions. This is crucial for helping Claude decide accurately when to select (and when not to select) your tool.
# Adds "Access to insights apis for ACME Financial Services abc.com . " to the beginning of all tool descriptions
docker run -e GLOBAL_TOOL_PROMPT="Access to insights apis for ACME Financial Services abc.com ." ...
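To illustrate the assumed behavior, here is a tiny sketch of how a global prompt would be combined with a tool's base description. The `describe` helper is hypothetical, not the server's actual code.

```python
# Sketch of the assumed behavior: GLOBAL_TOOL_PROMPT is simply prepended
# to each tool's base description before it is advertised to the client.
def describe(base, global_prompt=""):
    return f"{global_prompt} {base}".strip() if global_prompt else base

print(describe("Get API endpoint schemas that match your intent.",
               "Access to insights apis for ACME Financial Services abc.com."))
```

Because the prompt appears first in every description, it dominates how the model decides whether any of this server's tools fit a request.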
Why I created this: I wanted to serve my private API, whose Swagger/OpenAPI docs are a few hundred KB in size.
Eventually I settled on this solution:
Boom, Claude now knows which API to call, with the full parameters!
I also had to add a second tool in this server to make the actual RESTful request, because the "fetch" server simply didn't work and I didn't want to debug why.
https://github.com/user-attachments/assets/484790d2-b5a7-475d-a64d-157e839ad9b0
Technical highlights:
query -> [Embedding] -> FAISS TopK -> OpenAPI docs -> MCP Client (Claude Desktop)
MCP Client -> Construct OpenAPI Request -> Execute Request -> Return Response
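The retrieval half of the pipeline above can be sketched end-to-end in a few lines. The 3-dimensional vectors here are hand-made stand-ins (the real embeddings come from a MiniLM model, and the index is FAISS rather than a Python sort), so only the ranking logic is illustrative.

```python
# Toy version of: query -> [Embedding] -> TopK -> endpoint docs.
# Hypothetical endpoint docs paired with stand-in embedding vectors.
endpoints = [
    ("GET /stocks",     [1.0, 0.0, 0.0]),
    ("POST /jobs",      [0.0, 1.0, 0.0]),
    ("GET /users/{id}", [0.0, 0.0, 1.0]),
]

def top_k(query_vec, k=2):
    # Inner-product ranking, the same operation a FAISS IndexFlatIP
    # performs over the full per-endpoint vector store.
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    ranked = sorted(endpoints, key=lambda e: -dot(e[1], query_vec))
    return [doc for doc, _ in ranked[:k]]

print(top_k([0.9, 0.1, 0.0], k=1))  # → ['GET /stocks']
```

The MCP client then takes the returned endpoint schema, constructs the request, and calls the make_request tool to execute it.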
Here is the multi-instance config example. I designed it so it can be used more flexibly for multiple sets of APIs:
{
"mcpServers": {
"finance_openapi": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"OPENAPI_JSON_DOCS_URL=https://api.finance.com/openapi.json",
"-e",
"MCP_API_PREFIX=finance",
"-e",
"GLOBAL_TOOL_PROMPT='Access to insights apis for ACME Financial Services abc.com .'",
"buryhuang/mcp-server-any-openapi:latest"
]
},
"healthcare_openapi": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"OPENAPI_JSON_DOCS_URL=https://api.healthcare.com/openapi.json",
"-e",
"MCP_API_PREFIX=healthcare",
"-e",
"GLOBAL_TOOL_PROMPT='Access to insights apis for Healthcare API services efg.com .'",
"buryhuang/mcp-server-any-openapi:latest"
]
}
}
}
In this example:
https://api.finance.com for finance APIs
https://api.healthcare.com for healthcare APIs
You can override the base URL used for request execution with the API_REQUEST_BASE_URL environment variable:
{
"mcpServers": {
"finance_openapi": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"OPENAPI_JSON_DOCS_URL=https://api.finance.com/openapi.json",
"-e",
"API_REQUEST_BASE_URL=https://api.finance.staging.com",
"-e",
"MCP_API_PREFIX=finance",
"-e",
"GLOBAL_TOOL_PROMPT='Access to insights apis for ACME Financial Services abc.com .'",
"buryhuang/mcp-server-any-openapi:latest"
]
}
}
}
Claude Desktop Project Prompt:
You should get the API spec details from the finance_api_request_schema tool.
Your task is to use the finance_make_request tool to make the requests and get the response. You should follow the API spec to add the authorization header:
Authorization: Bearer <xxxxxxxxx>
Note: The base URL will be returned in the api_request_schema response, you don't need to specify it manually.
In chat, you can do:
Get prices for all stocks
To install Scalable OpenAPI Endpoint Discovery and API Request Tool for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @baryhuang/mcp-server-any-openapi --client claude
pip install mcp-server-any-openapi
The server provides the following tools (where {prefix} is determined by MCP_API_PREFIX):
Get API endpoint schemas that match your intent. Returns endpoint details including path, method, parameters, and response formats.
Input Schema:
{
"query": {
"type": "string",
"description": "Describe what you want to do with the API (e.g., 'Get user profile information', 'Create a new job posting')"
}
}
Essential for reliable execution with complex APIs where simplified implementations fail.
Input Schema:
{
"method": {
"type": "string",
"description": "HTTP method (GET, POST, PUT, DELETE, PATCH)",
"enum": ["GET", "POST", "PUT", "DELETE", "PATCH"]
},
"url": {
"type": "string",
"description": "Fully qualified API URL (e.g., https://api.example.com/users/123)"
},
"headers": {
"type": "object",
"description": "Request headers (optional)",
"additionalProperties": {
"type": "string"
}
},
"query_params": {
"type": "object",
"description": "Query parameters (optional)",
"additionalProperties": {
"type": "string"
}
},
"body": {
"type": "object",
"description": "Request body for POST, PUT, PATCH (optional)"
}
}
Response Format:
{
"status_code": 200,
"headers": {
"content-type": "application/json",
...
},
"body": {
// Response data
}
}
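As a sketch of how the make_request input maps onto an actual HTTP call, the hypothetical helper below normalizes the tool input into keyword arguments for a `requests.request(**kwargs)` call (it is an illustration of the schema above, not the server's real implementation):

```python
# Sketch: turn a make_request tool input into requests.request kwargs.
# build_request_kwargs is a hypothetical helper for illustration.
def build_request_kwargs(tool_input: dict) -> dict:
    method = tool_input["method"].upper()
    if method not in {"GET", "POST", "PUT", "DELETE", "PATCH"}:
        raise ValueError(f"unsupported method: {method}")
    kwargs = {"method": method, "url": tool_input["url"]}
    if tool_input.get("headers"):
        kwargs["headers"] = tool_input["headers"]
    if tool_input.get("query_params"):
        kwargs["params"] = tool_input["query_params"]
    # Per the schema, a body is only meaningful for POST/PUT/PATCH.
    if method in {"POST", "PUT", "PATCH"} and tool_input.get("body") is not None:
        kwargs["json"] = tool_input["body"]
    return kwargs
```

The response is then shaped into the status_code/headers/body structure shown above before being returned to the client.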
Official images are built for two platforms:
# Build and push using buildx
docker buildx create --use
docker buildx build --platform linux/amd64,linux/arm64 \
-t buryhuang/mcp-server-any-openapi:latest \
--push .
Control tool names through MCP_API_PREFIX:
# Produces tools: finance_api_request_schema and finance_make_request
docker run -e MCP_API_PREFIX=finance ...
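The assumed naming rule can be sketched as a one-liner; `tool_names` is a hypothetical helper showing how both tool names derive from MCP_API_PREFIX, with the documented default namespace as the fallback.

```python
import os

# Sketch of the assumed naming rule: both tool names are built from
# MCP_API_PREFIX, falling back to the default namespace "any_openapi".
def tool_names(prefix=None):
    prefix = prefix or os.environ.get("MCP_API_PREFIX", "any_openapi")
    return [f"{prefix}_api_request_schema", f"{prefix}_make_request"]

print(tool_names("finance"))  # → ['finance_api_request_schema', 'finance_make_request']
```

Distinct prefixes are what allow several containers, each indexing a different API, to coexist in one Claude Desktop configuration without tool-name collisions.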
docker pull buryhuang/mcp-server-any-openapi:latest
docker build -t mcp-server-any-openapi .
docker run \
-e OPENAPI_JSON_DOCS_URL=https://api.example.com/openapi.json \
-e MCP_API_PREFIX=finance \
buryhuang/mcp-server-any-openapi:latest
EndpointSearcher: Core class that handles parsing the OpenAPI document, building per-endpoint embeddings, maintaining the in-memory FAISS index, and serving semantic queries.
Server implementation: run the server directly with:
python -m mcp_server_any_openapi
Configure the MCP server in your Claude Desktop settings:
{
"mcpServers": {
"any_openapi": {
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"OPENAPI_JSON_DOCS_URL=https://api.example.com/openapi.json",
"-e",
"MCP_API_PREFIX=finance",
"-e",
"GLOBAL_TOOL_PROMPT='Access to insights apis for ACME Financial Services abc.com .'",
"buryhuang/mcp-server-any-openapi:latest"
]
}
}
}
Create a feature branch (git checkout -b feature/amazing-feature), commit your changes (git commit -m 'Add some amazing feature'), and push the branch (git push origin feature/amazing-feature).
This project is licensed under the terms included in the LICENSE file.