by drestrepom
Provides a standardized interface that exposes each GraphQL query as an MCP tool, enabling seamless interaction with GraphQL APIs.

MCP GraphQL automatically introspects a GraphQL endpoint and creates an MCP tool for every available query. Each tool's input schema is derived from the query's parameters, allowing MCP-compatible clients to call GraphQL operations without writing any boilerplate.
Install with `pip install mcp-graphql` (or run it directly with `uvx mcp-graphql`). Then start the server:

```shell
mcp-graphql --api-url="https://api.example.com/graphql" --auth-token="your-token"
```
```python
import asyncio
from pathlib import Path

from mcp_graphql import serve

asyncio.run(
    serve(
        api_url="https://api.example.com/graphql",
        auth_headers={"Authorization": "Bearer your-token"},
        queries_file=Path("queries.gql"),
    )
)
```
Use `--auth-headers` for custom authentication and `--max-depth` to cap the depth of auto-generated queries, or provide explicit queries.

An MCP (Model Context Protocol) server that enables interaction with GraphQL APIs.
MCP GraphQL is a tool that implements the Model Context Protocol (MCP) to provide a standardized interface for interacting with GraphQL APIs. It automatically exposes each GraphQL query as a separate MCP tool, allowing MCP-compatible clients to seamlessly communicate with GraphQL services.
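To illustrate how a query's parameters can become an MCP tool's input schema, here is a minimal sketch. The type mapping and the regex-based variable extraction are illustrative assumptions, not mcp-graphql's actual implementation:

```python
# Sketch: deriving a tool input schema from a GraphQL operation's
# variable definitions ($name: Type!). Illustrative only.
import re

# Minimal mapping from GraphQL scalar types to JSON Schema types
GQL_TO_JSON = {"ID": "string", "String": "string", "Int": "integer",
               "Float": "number", "Boolean": "boolean"}

def input_schema(operation: str) -> dict:
    """Build a JSON Schema object from the operation's variable declarations."""
    schema = {"type": "object", "properties": {}, "required": []}
    for name, gql_type, bang in re.findall(r"\$(\w+):\s*(\w+)(!?)", operation):
        schema["properties"][name] = {"type": GQL_TO_JSON.get(gql_type, "string")}
        if bang:  # a trailing ! marks the variable as non-nullable
            schema["required"].append(name)
    return schema

print(input_schema("query GetUser($id: ID!, $limit: Int) { user(id: $id) { name } }"))
# → {'type': 'object', 'properties': {'id': {'type': 'string'},
#    'limit': {'type': 'integer'}}, 'required': ['id']}
```

An MCP client sees this schema on the tool and knows `id` is a required string while `limit` is an optional integer.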
When using `uv`, no specific installation is needed; we will use `uvx` to run mcp-graphql directly. Alternatively, you can install mcp-graphql via pip:

```shell
pip install mcp-graphql
```
Or install from source:

```shell
git clone https://github.com/your-username/mcp_graphql.git
cd mcp_graphql
pip install .
```
Using uvx:

```shell
uvx mcp-graphql --api-url="https://api.example.com/graphql" --auth-token="your-token"
```

Using a pip installation:

```shell
mcp-graphql --api-url="https://api.example.com/graphql" --auth-token="your-token"
```

or

```shell
python -m mcp_graphql --api-url="https://api.example.com/graphql" --auth-token="your-token"
```
- `--api-url`: GraphQL API URL (required)
- `--auth-token`: Authentication token (optional; can also be set via the `MCP_AUTH_TOKEN` environment variable)
- `--auth-type`: Authentication type, default "Bearer" (optional)
- `--auth-headers`: Custom authentication headers in JSON format (optional)
- `--queries-file`: Path to a `.gql` file containing predefined GraphQL queries (optional)
- `--queries`: Predefined GraphQL queries passed directly as a string (optional)
- `--max-depth`: Maximum depth when auto-generating queries (default: 5)

Example with custom headers:
```shell
mcp-graphql --api-url="https://api.example.com/graphql" --auth-headers='{"Authorization": "Bearer token", "X-API-Key": "key"}'
```

Example with a predefined queries file:

```shell
mcp-graphql --api-url="https://api.example.com/graphql" --queries-file="./queries.gql"
```

Example passing queries directly as a string (use single quotes to avoid shell conflicts):

```shell
mcp-graphql --api-url="https://api.example.com/graphql" --queries='query Hello { hello }'
```
If neither `--queries-file` nor `--queries` is supplied, mcp-graphql will automatically build a query by introspecting the GraphQL schema and selecting all scalar fields up to a configurable depth. This is convenient for quickly exploring an API, but it has two main drawbacks: the generated query can grow very large on deeply nested schemas, and it may select far more data than you actually need.

The `--max-depth` option mitigates the first issue by limiting the recursion depth (default: 5). Even so, the best practice is to define the exact queries you need through `--queries-file` or `--queries`; that way you expose only the operations you intend clients to use and keep responses small.
Example using `--max-depth` to limit the auto-generated query to depth 2:

```shell
mcp-graphql --api-url="https://api.example.com/graphql" --max-depth 2
```
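The depth-limited auto-generation described above can be sketched as a simple recursive walk. The toy schema format below (`None` for scalars, nested dicts for object types) is an assumption for illustration; the real server works from introspection results:

```python
# Sketch: building a selection set of scalar fields up to max_depth.
# Illustrative only, not mcp-graphql's internal code.
def build_selection(fields: dict, max_depth: int, depth: int = 1) -> str:
    """Select scalar fields; recurse into object fields until max_depth."""
    parts = []
    for name, sub in fields.items():
        if sub is None:  # scalar leaf: always included
            parts.append(name)
        elif depth < max_depth:  # nested object: recurse one level deeper
            inner = build_selection(sub, max_depth, depth + 1)
            if inner:
                parts.append(f"{name} {{ {inner} }}")
    return " ".join(parts)

# "posts" is a nested object; None marks scalar fields
schema = {"id": None, "name": None,
          "posts": {"title": None, "comments": {"body": None}}}
print(build_selection(schema, max_depth=2))
# → id name posts { title }   (depth 2 drops the deeper "comments")
```

Raising `max_depth` to 3 would also pull in `comments { body }`, which is exactly how auto-generated queries balloon on nested schemas.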
For production workloads you should supply your own queries:

```shell
# Using a file
mcp-graphql --api-url="https://api.example.com/graphql" \
  --queries-file="./queries.gql"

# Or as a string
mcp-graphql --api-url="https://api.example.com/graphql" \
  --queries='query UserMini { viewer { id name } }'
```
The `queries.gql` file should contain one or more named operations, e.g.:

```graphql
# queries.gql
query GetUser($id: ID!) {
  user(id: $id) {
    id
    name
    email
  }
}

query ListPosts {
  posts {
    id
    title
  }
}
```
```python
import asyncio
from pathlib import Path

from mcp_graphql import serve

auth_headers = {"Authorization": "Bearer your-token"}
api_url = "https://api.example.com/graphql"
queries_file = Path("queries.gql")  # optional, set to None to expose all queries

asyncio.run(serve(api_url, auth_headers, queries_file=queries_file))
```
Passing the queries directly as a string from code:

```python
queries_str = """
query Hello($name: String!) {
  hello(name: $name)
}
"""

asyncio.run(serve(api_url, auth_headers, queries=queries_str, max_depth=3))
```
Add to your Claude settings.

Using uvx:

```json
{
  "mcpServers": {
    "graphql": {
      "command": "uvx",
      "args": ["mcp-graphql", "--api-url", "https://api.example.com/graphql"]
    }
  }
}
```

Using Docker:

```json
{
  "mcpServers": {
    "graphql": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "mcp/graphql", "--api-url", "https://api.example.com/graphql"]
    }
  }
}
```

Using a pip installation:

```json
{
  "mcpServers": {
    "graphql": {
      "command": "python",
      "args": ["-m", "mcp_graphql", "--api-url", "https://api.example.com/graphql"]
    }
  }
}
```
MCP GraphQL automatically:

- Introspects the schema of the target GraphQL endpoint
- Exposes each available query as a separate MCP tool
- Derives each tool's input schema from the query's parameters

When a tool is called, the server executes the corresponding GraphQL query with the supplied arguments and returns the result to the client.
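Under the hood, executing a tool call amounts to a standard GraphQL-over-HTTP POST. The sketch below shows the request body such a call carries, following the common GraphQL HTTP convention rather than mcp-graphql's internal code:

```python
# Sketch: the JSON body of a GraphQL HTTP POST for a tool invocation.
# Follows the standard {"query": ..., "variables": ...} convention.
import json

def graphql_payload(query: str, variables: dict) -> str:
    """Serialize an operation and its variables as a request body."""
    return json.dumps({"query": query, "variables": variables})

body = graphql_payload(
    "query GetUser($id: ID!) { user(id: $id) { id name email } }",
    {"id": "42"},
)
print(body)
# → {"query": "query GetUser($id: ID!) { user(id: $id) { id name email } }",
#    "variables": {"id": "42"}}
```

The tool's arguments map one-to-one onto the `variables` object, which is why the derived input schemas make GraphQL operations directly callable by MCP clients.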
Set up a development environment:

```shell
# Create virtual environment using uv
uv venv

# Install dependencies
uv sync

# Lint
ruff check .
```
When working locally you can start the MCP GraphQL server with hot-reloading and inspect its tools using the Model Context Protocol Inspector:

```shell
npx "@modelcontextprotocol/inspector" uv run -n --project $PWD mcp-graphql --api-url http://localhost:3010/graphql
```

Replace `http://localhost:3010/graphql` with the URL of your local GraphQL endpoint if it differs.
This project is licensed under the MIT License. See the LICENSE file for details.
Contributions are welcome. Please feel free to submit a Pull Request or open an Issue.
To supply the auth token via the environment instead of a flag:

```json
{
  "mcpServers": {
    "graphql": {
      "command": "python",
      "args": ["-m", "mcp_graphql", "--api-url", "https://api.example.com/graphql"],
      "env": { "MCP_AUTH_TOKEN": "<YOUR_TOKEN>" }
    }
  }
}
```