by asusevski
Provides real-time Dota 2 statistics, match data, player profiles and related information via a Model Context Protocol interface for LLMs and AI assistants.
OpenDota MCP Server implements a Model Context Protocol (MCP) service that exposes the OpenDota REST API as a set of standardized tools. It enables language models and AI assistants to fetch up‑to‑date Dota 2 data—including player stats, match details, hero rankings, and professional match information—through simple function calls.
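For orientation, the kind of HTTP request a tool call ultimately translates into can be sketched in plain Python. The helper below is illustrative, not part of the project; the endpoint paths (`/players/{account_id}`, `/matches/{match_id}`) and the `api_key` query parameter follow OpenDota's public REST API:

```python
import urllib.parse
from typing import Optional

OPENDOTA_BASE = "https://api.opendota.com/api"

def build_url(endpoint: str, api_key: Optional[str] = None) -> str:
    """Build an OpenDota REST URL, appending the API key when one is configured."""
    url = f"{OPENDOTA_BASE}/{endpoint.lstrip('/')}"
    if api_key:
        url += "?" + urllib.parse.urlencode({"api_key": api_key})
    return url

# A get_player_by_id tool call would fetch something like:
player_url = build_url("players/111620041")
# A keyed request behind get_match_data:
match_url = build_url("matches/7000000000", api_key="your_api_key_here")
```

Fetching and parsing the JSON response is what the server's tools wrap behind the MCP interface.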
The server can be used through the included example client or by integrating the MCP endpoint into your own LLM workflow, and it exposes tools such as get_player_by_id, get_match_data, and search_player.

Q: Do I need an OpenDota API key?
A: No; the OpenDota API can be queried without a key, although setting OPENDOTA_API_KEY raises the rate limits.

Q: Which Python version is required?
A: The supported version is declared in pyproject.toml.

Q: How do I run the server on WSL for Claude Desktop?
A: Add an entry to claude_desktop_config.json that launches the server via wsl.exe, as shown in the README.

Q: Can I develop or extend the server?
A: Yes; install the development dependencies (uv pip install -e ".[dev]") and modify the tools in src/opendota_server.

Q: Where are the MCP tool definitions?
A: In src/opendota_server/tools.py; they are loaded automatically by the MCP framework when the server starts.

A Model Context Protocol (MCP) server implementation for accessing OpenDota API data. This server enables LLMs and AI assistants to retrieve real-time Dota 2 statistics, match data, player information, and more through a standard interface.
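The tool definitions live in src/opendota_server/tools.py and are picked up when the server starts. As a rough mental model, a name-to-function registry can be sketched like this; the decorator and registry below are hypothetical stand-ins, not the project's actual MCP framework code:

```python
from typing import Callable, Dict

# Hypothetical registry mapping tool names to handler functions.
TOOLS: Dict[str, Callable] = {}

def tool(name: str):
    """Register a function under a tool name so the server can dispatch calls to it."""
    def register(fn: Callable) -> Callable:
        TOOLS[name] = fn
        return fn
    return register

@tool("get_player_by_id")
def get_player_by_id(account_id: int) -> dict:
    # The real implementation would fetch /players/{account_id} from OpenDota.
    return {"account_id": account_id}

# Dispatch by name, as an MCP server does when a model requests a tool call:
result = TOOLS["get_player_by_id"](111620041)
```

The real server registers many such tools; the MCP layer then advertises them to the connected model.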
# Clone the repository
git clone https://github.com/asusevski/opendota-mcp-server.git
cd opendota-mcp-server
# Option 1: Automated setup (works with bash, zsh, and other shells)
./scripts/setup_env.sh
# Option 2: Manual installation with uv
uv pip install -e .
# For development dependencies
uv pip install -e ".[dev]"
export OPENDOTA_API_KEY=your_api_key_here
python -m src.opendota_server.server
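Before starting the server, the OPENDOTA_API_KEY step above can be verified from Python. This is a small standalone check of my own, not project code; whether the server degrades gracefully without a key depends on OpenDota's anonymous rate limits:

```python
import os

def get_api_key():
    """Return the configured OpenDota API key, or None if the variable is unset."""
    return os.environ.get("OPENDOTA_API_KEY")

if get_api_key() is None:
    print("OPENDOTA_API_KEY is not set")
```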
To connect the server to Claude Desktop, follow the MCP quickstart guide: https://modelcontextprotocol.io/quickstart/user
If you use WSL, assuming you have already cloned the repo and set up the Python environment, this is how I wrote claude_desktop_config.json:
{
"mcpServers": {
"opendota": {
"command": "wsl.exe",
"args": [
"--",
"bash",
"-c",
"cd ~/opendota-mcp-server && source .venv/bin/activate && python src/opendota_server/server.py"
]
}
}
}
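A malformed claude_desktop_config.json fails silently in some setups, so it is worth parsing the file before restarting Claude Desktop. A quick standalone check (the config is inlined here for illustration; in practice you would read the actual file):

```python
import json

raw = """
{
  "mcpServers": {
    "opendota": {
      "command": "wsl.exe",
      "args": [
        "--",
        "bash",
        "-c",
        "cd ~/opendota-mcp-server && source .venv/bin/activate && python src/opendota_server/server.py"
      ]
    }
  }
}
"""

config = json.loads(raw)  # raises json.JSONDecodeError on a malformed file
server = config["mcpServers"]["opendota"]
print(server["command"], len(server["args"]))
```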
# Run the example client
python -m src.client
License: MIT