by modelscope
Provides AI agents and chatbots with direct access to ModelScope’s ecosystem, enabling image generation, resource discovery, and contextual information via MCP.
ModelScope MCP Server empowers AI agents and conversational assistants to interact with ModelScope’s extensive collection of AI models, datasets, studios, research papers, and other resources. It exposes capabilities such as text‑to‑image, image‑to‑image generation, advanced resource search, and access to user context, all through the Model Context Protocol (MCP) interface.
Run the server with `uv run modelscope-mcp-server` (default stdio transport), or `uv run modelscope-mcp-server --transport http --port 8000` for HTTP/SSE transport. Point your MCP client at `http://127.0.0.1:8000/mcp/`, or use the Docker image `ghcr.io/modelscope/modelscope-mcp-server`. The server handles `tools/list`, `resources/search`, and image‑generation calls from any MCP‑compatible tool (Claude Desktop, Cursor, VS Code, etc.).

Q: Do I need Docker to run the server?
A: No. The server runs natively with Python (`uv run modelscope-mcp-server`); Docker is a convenient alternative.
Q: Which transport protocols are supported?
A: Standard stdio (default), HTTP, and SSE transports are available.

Q: How are updates released?
A: GitHub Actions automatically creates a release, publishes to PyPI, and pushes a Docker image whenever a new version tag is pushed.

Q: Can I use the server with other MCP‑compatible clients?
A: Yes. Any client that follows the MCP JSON Configuration Standard can connect by pointing to the server URL or using the command configuration.

Q: Where can I find the API token?
A: In the ModelScope web UI under Home → Access Tokens.
Empowers AI agents and chatbots with direct access to ModelScope's rich ecosystem of AI resources. From generating images to discovering cutting-edge models, datasets, apps and research papers, this MCP server makes ModelScope's vast collection of tools and services accessible through simple conversational interactions.
For a quick trial or a hosted option, visit the project page on the ModelScope MCP Plaza.
📖 For detailed instructions, refer to the ModelScope Token Documentation
Add the following JSON configuration to your MCP client's configuration file:
```json
{
  "mcpServers": {
    "modelscope-mcp-server": {
      "command": "uvx",
      "args": ["modelscope-mcp-server"],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}
```
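As a sketch of what this configuration expresses, the snippet below merges the same server entry into a client config file programmatically. The `add_modelscope_server` helper is purely illustrative and not part of any client's API:

```python
import json
import os
import tempfile

def add_modelscope_server(config_path: str, api_token: str) -> dict:
    """Illustrative helper: merge the modelscope-mcp-server entry into an
    existing MCP client config file (the path varies by client)."""
    entry = {
        "command": "uvx",
        "args": ["modelscope-mcp-server"],
        "env": {"MODELSCOPE_API_TOKEN": api_token},
    }
    config = {}
    if os.path.exists(config_path):
        with open(config_path) as f:
            config = json.load(f)
    # Preserve any servers already configured; add or overwrite ours.
    config.setdefault("mcpServers", {})["modelscope-mcp-server"] = entry
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
    return config

# Example run against a temporary file:
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "mcp.json")
    cfg = add_modelscope_server(path, "your-api-token")
    print(cfg["mcpServers"]["modelscope-mcp-server"]["command"])  # uvx
```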
Or, you can use the pre-built Docker image:
```json
{
  "mcpServers": {
    "modelscope-mcp-server": {
      "command": "docker",
      "args": [
        "run", "--rm", "-i",
        "-e", "MODELSCOPE_API_TOKEN",
        "ghcr.io/modelscope/modelscope-mcp-server"
      ],
      "env": {
        "MODELSCOPE_API_TOKEN": "your-api-token"
      }
    }
  }
}
```
Refer to the MCP JSON Configuration Standard for more details.
This format is widely adopted across the MCP ecosystem:
- Claude Desktop: `~/.claude/claude_desktop_config.json`
- Cursor: `~/.cursor/mcp.json`
- VS Code: `.vscode/mcp.json`

Clone and Setup:

```shell
git clone https://github.com/modelscope/modelscope-mcp-server.git
cd modelscope-mcp-server
uv sync
```
Activate Environment (or use your IDE):

```shell
source .venv/bin/activate  # Linux/macOS
```

Set Your API Token (see Quick Start section for token setup):

```shell
export MODELSCOPE_API_TOKEN="your-api-token"
# Or create a .env file: echo 'MODELSCOPE_API_TOKEN="your-api-token"' > .env
```
Run a quick demo to explore the server's capabilities:

```shell
uv run python demo.py
```

Use the --full flag for a comprehensive feature demonstration:

```shell
uv run python demo.py --full
```
```shell
# Standard stdio transport (default)
uv run modelscope-mcp-server

# Streamable HTTP transport for web integration
uv run modelscope-mcp-server --transport http

# HTTP/SSE transport with custom port (default: 8000)
uv run modelscope-mcp-server --transport [http|sse] --port 8080
```
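The flag handling above can be mirrored with a small argparse sketch. This is illustrative only; the real server's CLI parser may differ:

```python
import argparse

# Illustrative model of the CLI surface described above (--transport, --port).
parser = argparse.ArgumentParser(prog="modelscope-mcp-server")
parser.add_argument("--transport", choices=["stdio", "http", "sse"],
                    default="stdio", help="transport protocol")
parser.add_argument("--port", type=int, default=8000,
                    help="port for HTTP/SSE transports")

args = parser.parse_args(["--transport", "http", "--port", "8080"])
print(args.transport, args.port)  # http 8080
```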
For HTTP/SSE mode, connect using a local URL in your MCP client configuration:
```json
{
  "mcpServers": {
    "modelscope-mcp-server": {
      "url": "http://127.0.0.1:8000/mcp/"
    }
  }
}
```
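Under the hood, an MCP client speaks JSON-RPC 2.0 to this endpoint. The sketch below only constructs a `tools/list` request body in the shape the MCP specification defines; actually sending it requires a running server, so no network call is made here:

```python
import json

# JSON-RPC 2.0 request an MCP client would POST to the streamable HTTP
# endpoint (e.g. http://127.0.0.1:8000/mcp/) to enumerate available tools.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
    "params": {},
}
body = json.dumps(request)
print(body)
```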
You can also debug the server using the MCP Inspector tool:
```shell
# Run in UI mode with stdio transport (can switch to HTTP/SSE in the Web UI as needed)
npx @modelcontextprotocol/inspector uv run modelscope-mcp-server

# Run in CLI mode with HTTP transport (supports operations across tools, resources, and prompts)
npx @modelcontextprotocol/inspector --cli http://127.0.0.1:8000/mcp/ --transport http --method tools/list
```
```shell
# Run all tests
uv run pytest

# Run a specific test file
uv run pytest tests/test_search_papers.py

# With coverage report
uv run pytest --cov=src --cov-report=html
```
This project uses GitHub Actions for automated CI/CD workflows that run on every push and pull request.
Run the same checks locally before submitting PRs:
```shell
# Install and run pre-commit hooks
uv run pre-commit install
uv run pre-commit run --all-files

# Run tests
uv run pytest
```
Monitor CI status in the Actions tab.
This project uses GitHub Actions for automated release management. To create a new release:
Update version using the bump script:
```shell
uv run python scripts/bump_version.py [patch|minor|major]
# Or set a specific version: uv run python scripts/bump_version.py set 1.2.3.dev1
```
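For intuition, patch/minor/major bumping follows standard semantic-versioning rules. The sketch below illustrates the idea; the project's actual scripts/bump_version.py may differ (for instance, it also supports dev versions like 1.2.3.dev1):

```python
# Illustrative semantic-version bump logic (not the project's script).
def bump(version: str, part: str) -> str:
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"   # reset minor and patch
    if part == "minor":
        return f"{major}.{minor + 1}.0"  # reset patch
    return f"{major}.{minor}.{patch + 1}"  # default: patch bump

print(bump("1.2.3", "patch"))  # 1.2.4
print(bump("1.2.3", "minor"))  # 1.3.0
print(bump("1.2.3", "major"))  # 2.0.0
```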
Commit and tag (follow the script's output instructions):
```shell
git add src/modelscope_mcp_server/_version.py
git commit -m "chore: bump version to v{version}"
git tag v{version} && git push origin v{version}
```
Automated publishing: GitHub Actions will automatically create a GitHub release, publish the package to PyPI, and push the Docker image to ghcr.io.
We welcome contributions! Please ensure your PRs pass the tests and pre-commit checks described above.
This project is licensed under the Apache License (Version 2.0).