by kenliao94
Enables AI agents to manage RabbitMQ brokers and interact with queues and messages through Model Context Protocol tools, wrapping the broker's admin APIs and providing Pika-based message operations.
Install it from PyPI (pip install mcp-server-rabbitmq) or run it directly with uvx/uv as shown in the README mcpServers configuration, supplying the same arguments.
An AI agent can then invoke tools such as list_queues, publish_message, or create_user to control the broker; rabbitmqadmin commands are exposed as MCP tools.
Q: Can I change the broker after the server has started?
A: Yes, the MCP tools accept connection parameters, allowing the AI to point to a different host, port, or credential set during a session.
Q: Do I need TLS for the AMQP connection?
A: Set --use-tls true when launching the server if your broker uses amqps; otherwise leave it false.
Q: How is authentication handled for the HTTP endpoint?
A: The server uses FastMCP's BearerAuthProvider; configure your IdP and pass bearer tokens in client requests.
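A client request carrying such a token might be assembled like this sketch; the endpoint URL and token value are hypothetical placeholders for whatever your IdP issues and wherever the server is hosted:

```python
import urllib.request

# Hypothetical remote MCP endpoint and IdP-issued bearer token.
endpoint = "https://mcp.example.com/mcp"
token = "eyJhbGciOi...example"

# Attach the token in the Authorization header; the server-side
# BearerAuthProvider validates it before dispatching any tool call.
request = urllib.request.Request(
    endpoint,
    headers={"Authorization": f"Bearer {token}"},
)
```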
Q: Is there parity with rabbitmqadmin?
A: Full parity is planned on the roadmap; the current implementation covers the most common admin commands.
Q: Can I use OAuth instead of basic auth?
A: OAuth support is on the roadmap; currently only username/password authentication is available.
A Model Context Protocol server implementation for RabbitMQ operations.
This MCP server wraps the admin APIs of a RabbitMQ broker as MCP tools. It also uses Pika to interact with RabbitMQ at the message level. You can also point it at a different RabbitMQ broker mid-conversation (the default is configured during server initialization).
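As an illustration of the admin surface being wrapped, the RabbitMQ management HTTP API can also be called directly. This sketch only constructs the request without sending it; the host, port, and credentials are placeholders (15671 matches this server's --api-port default):

```python
import base64
import urllib.request

# Placeholder connection details; substitute your broker's values.
host, api_port = "localhost", 15671
user, password = "guest", "guest"

# The management API lists queues at /api/queues.
url = f"https://{host}:{api_port}/api/queues"
token = base64.b64encode(f"{user}:{password}".encode()).decode()
request = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})

# urllib.request.urlopen(request) would return a JSON array of queue
# descriptions on a live broker; here we only build the request.
```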
BearerAuthProvider
You can start a remote RabbitMQ MCP server by configuring your own IdP and a third-party authorization provider.
The package is available on PyPI, so you can use uvx without having to fork and build the MCP server locally first.
To install RabbitMQ MCP Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @kenliao94/mcp-server-rabbitmq --client claude
https://smithery.ai/server/@kenliao94/mcp-server-rabbitmq
https://pypi.org/project/mcp-server-rabbitmq/
Use uvx directly in your MCP client config
{
"mcpServers": {
"rabbitmq": {
"command": "uvx",
"args": [
"mcp-server-rabbitmq@latest",
"--rabbitmq-host",
"<hostname ex. test.rabbit.com, localhost>",
"--port",
"<port number ex. 5672>",
"--username",
"<rabbitmq username>",
"--password",
"<rabbitmq password>",
"--api-port",
"<port number for the admin API, defaults to 15671>",
"--use-tls",
"<true if uses amqps, false otherwise>"
]
}
}
}
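Before wiring the arguments into a client config, the same invocation can be tried from a terminal; the hostname, credentials, and ports below are placeholders:

```shell
# Launch the MCP server against a local broker (placeholder values).
uvx mcp-server-rabbitmq@latest \
  --rabbitmq-host localhost \
  --port 5672 \
  --username guest \
  --password guest \
  --api-port 15671 \
  --use-tls false
```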
{
"mcpServers": {
"rabbitmq": {
"command": "uv",
"args": [
"--directory",
"/path/to/repo/mcp-server-rabbitmq",
"run",
"mcp-server-rabbitmq",
"--rabbitmq-host",
"<hostname ex. test.rabbit.com, localhost>",
"--port",
"<port number ex. 5672>",
"--username",
"<rabbitmq username>",
"--password",
"<rabbitmq password>",
"--use-tls",
"<true if uses amqps, false otherwise>"
]
}
}
}
# Clone the repository
git clone https://github.com/kenliao94/mcp-server-rabbitmq.git
cd mcp-server-rabbitmq
# Install pre-commit hooks
pre-commit install
# Run tests
pytest
This project uses ruff for linting and formatting:
# Run linter
ruff check .
# Run formatter
ruff format .
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Explore related MCPs that share similar capabilities and solve comparable challenges
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.