by GongRzhe
Creates configurable MCP servers on the fly and generates complete, runnable Python code for FastMCP services, allowing rapid prototyping and deployment of custom tool‑rich APIs.
MCP Server Creator enables developers to define new MCP server configurations programmatically or via CLI, add custom tools and resources, and instantly produce fully functional Python server code.
Install with pip install mcp-server-creator (or use uvx mcp-server-creator). Run uvx mcp-server-creator or python -m mcp_server_creator to launch the interactive server. Use create_server, add_tool, add_resource, generate_server_code, and save_server to build servers within your own scripts; save_server writes the generated code to a runnable .py file in one step.

Q: Do I need FastMCP installed separately?
A: FastMCP is a dependency; installing the package pulls the required version (>=0.1.0).
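To confirm which FastMCP build was pulled in, you can query package metadata with the standard library; treating "fastmcp" as the dependency's distribution name is an assumption here:

from importlib.metadata import version, PackageNotFoundError

try:
    # "fastmcp" is assumed to be the distribution name of the FastMCP dependency
    print("FastMCP version:", version("fastmcp"))
except PackageNotFoundError:
    print("FastMCP is not installed in this environment")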
Q: Can I create async tools?
A: Yes, set is_async=True when calling add_tool and provide an async implementation.
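A minimal sketch, mirroring the Weather Service example further down; the server name, tool name, and implementation body are illustrative, and the server_id is assumed to be the lower-cased, underscored form of the server name (as in the examples below):

from mcp_server_creator import create_server, add_tool

create_server(name="Async Demo", description="Demo server", version="1.0.0")

add_tool(
    server_id="async_demo",        # assumed to be derived from the server name
    tool_name="fetch_status",      # hypothetical tool
    description="Fetch a status value asynchronously",
    parameters=[{"name": "url", "type": "str"}],
    return_type="dict",
    is_async=True,
    implementation='return {"url": url, "status": "ok"}'
)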
Q: How do I persist servers between sessions?
A: Use save_server to write the generated code to a file, then import or run that file later.
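A short sketch of that round trip, assuming a server with ID weather_service already exists in memory; save_server is awaited in the example further down, so it is treated as a coroutine here:

import asyncio
from mcp_server_creator import save_server

# Write the in-memory "weather_service" configuration to a runnable file
asyncio.run(save_server("weather_service", "weather_service.py"))

# Later, start the generated server directly:
#   python weather_service.py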
Q: Is there a way to list all in‑memory servers?
A: Call list_servers() to retrieve the identifiers of all configured servers.
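For example, assuming list_servers is importable from the package like the other helpers shown on this page:

from mcp_server_creator import list_servers

# Print the IDs of every server configuration currently held in memory
print(list_servers())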
Q: Can I run the creator itself as an MCP server?
A: Absolutely – it can be registered in Claude Desktop or any MCP client using the standard server configuration format.
A powerful Model Context Protocol (MCP) server that creates other MCP servers! This meta-server provides tools for dynamically generating FastMCP server configurations and Python code.
Install from PyPI, run it directly with uvx, or launch it as a Python module:
pip install mcp-server-creator
uvx mcp-server-creator
python -m mcp_server_creator
The MCP Server Creator is itself an MCP server that can be used with Claude Desktop or any MCP client.
Add to your claude_desktop_config.json:
{
  "mcpServers": {
    "mcp-server-creator": {
      "command": "uvx",
      "args": ["mcp-server-creator"]
    }
  }
}
You can also drive the creator from your own scripts through its Python API:

from mcp_server_creator import create_server, add_tool, generate_server_code

# Create a new server configuration
result = create_server(
    name="My API Server",
    description="A custom API integration server",
    version="1.0.0"
)

# Add a tool
add_tool(
    server_id="my_api_server",
    tool_name="fetch_data",
    description="Fetch data from the API",
    parameters=[{"name": "endpoint", "type": "str"}],
    return_type="dict",
    implementation='return {"data": f"Fetched from {endpoint}"}'
)

# Generate the code
code = generate_server_code("my_api_server")
print(code)
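From here you could persist the configuration instead of only printing the code; a brief continuation, assuming the same two-argument, awaitable save_server shown in the Weather Service example below:

import asyncio
from mcp_server_creator import save_server

# Write the "my_api_server" configuration to disk as runnable Python
asyncio.run(save_server("my_api_server", "my_api_server.py"))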
Available tools:

create_server: Create a new MCP server configuration
list_servers: List all server configurations in memory
get_server_details: Get detailed information about a specific server
add_tool: Add a tool to an existing server
add_resource: Add a resource to an existing server
generate_server_code: Generate complete Python code for a server
save_server: Save generated server code to a file
create_example_server: Create a complete example Weather Service

Example: building the Weather Service programmatically:

import asyncio
from mcp_server_creator import create_server, add_tool, add_resource, save_server

async def create_weather_service():
    # Create the server
    create_server(
        name="Weather Service",
        description="Get weather information",
        version="1.0.0"
    )

    # Add a weather tool
    add_tool(
        server_id="weather_service",
        tool_name="get_weather",
        description="Get current weather for a city",
        parameters=[
            {"name": "city", "type": "str"},
            {"name": "units", "type": "str", "default": '"celsius"'}
        ],
        return_type="dict",
        is_async=True,
        implementation='''
    # Your weather API logic here
    return {
        "city": city,
        "temperature": 20,
        "units": units,
        "condition": "sunny"
    }'''
    )

    # Add a resource
    add_resource(
        server_id="weather_service",
        uri="weather://{city}/current",
        name="Current Weather",
        description="Get current weather data",
        is_template=True,
        implementation='return {"city": city, "weather": "sunny"}'
    )

    # Save to file
    await save_server("weather_service", "weather_service.py")

# Run the creation
asyncio.run(create_weather_service())
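For orientation, the generated weather_service.py is a standard FastMCP program. A rough sketch of the shape to expect, assuming the generator emits the usual FastMCP decorator pattern (the exact output and import path may differ):

from fastmcp import FastMCP  # some setups use: from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather Service")

@mcp.tool()
async def get_weather(city: str, units: str = "celsius") -> dict:
    """Get current weather for a city"""
    # Your weather API logic here
    return {"city": city, "temperature": 20, "units": units, "condition": "sunny"}

@mcp.resource("weather://{city}/current")
def current_weather(city: str) -> dict:
    """Get current weather data"""
    return {"city": city, "weather": "sunny"}

if __name__ == "__main__":
    mcp.run()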
Development install and tests:
git clone https://github.com/GongRzhe/mcp-server-creator.git
cd mcp-server-creator
pip install -e .
python test_mcp_creator.py
MIT License - see LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request.
GongRzhe - gongrzhe@gmail.com