by jokemanfire
Manages Containerd CRI interfaces via an MCP server, handling container lifecycle, pod sandbox operations, and image management.
MCP Containerd provides an MCP server implementation that interfaces directly with Containerd's CRI APIs. It enables AI agents or other MCP clients to perform container and image operations through natural‑language commands.
```
cargo build --release
cargo run --release
```

The server connects by default to `unix:///run/containerd/containerd.sock`.
Use the `simple-chat-client` example (found in the Rust SDK repository) to send textual prompts; the server translates them into CRI calls (`list_containers`, `list_images`, etc.).

Q: Which Containerd socket does the server connect to?
A: By default it connects to `unix:///run/containerd/containerd.sock`. Change the endpoint in the source if needed.
Q: Do I need to configure any environment variables? A: No additional environment variables are required beyond having Containerd running.
Q: Can I customize the MCP server configuration? A: Current versions use hard‑coded defaults; future releases will add configurable connection parameters.
Q: How do I interact with the server?
A: Use the simple-chat-client
example from the Rust SDK, which sends natural‑language prompts and receives tool‑based responses.
Q: What license is the project under? A: Apache-2.0
This is an MCP server implemented using the RMCP (Rust Model Context Protocol) library for operating Containerd's CRI interfaces.
```
cargo build --release
cargo run --release
```

By default, the service will connect to the `unix:///run/containerd/containerd.sock` endpoint.
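Because the endpoint is hard-coded, a quick pre-flight check that the socket file actually exists can save debugging time. Below is a minimal stdlib-only sketch; the constant and helper names are hypothetical illustrations, not taken from the server's source:

```rust
use std::path::Path;

// Default CRI endpoint from the README. `DEFAULT_ENDPOINT` and `socket_path`
// are hypothetical names used here for illustration only.
const DEFAULT_ENDPOINT: &str = "unix:///run/containerd/containerd.sock";

/// Strip the `unix://` scheme to recover the filesystem path of the socket.
/// Returns `None` for non-unix endpoints.
fn socket_path(endpoint: &str) -> Option<&Path> {
    endpoint.strip_prefix("unix://").map(Path::new)
}

fn main() {
    let path = socket_path(DEFAULT_ENDPOINT).expect("endpoint must use the unix:// scheme");
    if path.exists() {
        println!("containerd socket found at {}", path.display());
    } else {
        println!("no socket at {}; is containerd running?", path.display());
    }
}
```

Run as root (or with access to the containerd socket) so the existence check reflects what the server will see.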
The `simple-chat-client` example allows you to interact with the MCP Containerd service. (Note: `simple-chat-client` has moved to the Rust SDK repository.)
Example interaction:

```
> please give me a list of containers
AI: Listing containers...
Tool: list_containers
Result: {"containers":[...]}

> please give me a list of images
AI: Here are the images in your containerd:
Tool: list_images
Result: {"images":[...]}
```
The MCP server includes the following main components:
- version service: Provides CRI version information
- runtime service: Provides container and Pod runtime operations
- image service: Provides container image operations

Currently using default configuration. Future versions will support customizing connection parameters through configuration files.
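The three service groups above can be pictured as a simple dispatch table that routes each MCP tool name to the CRI service handling it. This is an illustrative stdlib-only sketch (the enum and function names are hypothetical, not the server's actual routing code), covering only the tool names mentioned in this document:

```rust
// The three CRI service groups listed above. The routing function is a
// hypothetical illustration of how tool names map onto services.
#[derive(Debug, PartialEq)]
enum CriService {
    Version, // CRI version information
    Runtime, // container and Pod runtime operations
    Image,   // container image operations
}

/// Map an MCP tool name to the CRI service group that would handle it.
fn route_tool(tool: &str) -> Option<CriService> {
    match tool {
        "version" => Some(CriService::Version),
        "list_containers" => Some(CriService::Runtime),
        "list_images" => Some(CriService::Image),
        _ => None, // unknown tool: the server would report an error to the client
    }
}

fn main() {
    for tool in ["version", "list_containers", "list_images", "unknown"] {
        println!("{tool} -> {:?}", route_tool(tool));
    }
}
```

Keeping the routing in one match expression makes it obvious which service owns each tool as new CRI operations are added.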
Apache-2.0