by ofek
Extends the Model Context Protocol to serve any Python command‑line application, allowing CLI tools to be accessed programmatically through MCP.
PyCLI MCP provides an extensible MCP server that can wrap any Python CLI application—whether built with Click, Typer, or (experimentally) argparse—so the application can be invoked via the Model Context Protocol.
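Conceptually, once a CLI is wrapped, an MCP client invokes it through a standard `tools/call` JSON-RPC request; a sketch of such a request (the tool name and argument names are hypothetical and depend on how the CLI was registered):

```python
import json

# A hypothetical MCP "tools/call" request for a wrapped CLI command.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "my_cli_greet",          # hypothetical tool name
        "arguments": {"name": "world"},  # maps to the CLI's options
    },
}
print(json.dumps(request, indent=2))
```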
pip install pycli-mcp
Q: Which CLI frameworks are supported?
A: Click and Typer are fully supported; argparse has experimental support.

Q: How do I run the server in production?
A: Launch the Python entry script (e.g., `python -m my_cli_server`) after installing the package. Use a typical process manager (systemd, Docker, etc.) to keep it running.

Q: Is there a Docker image available?
A: Not directly, but the package can be installed in any Python-based Docker image.

Q: What Python versions are compatible?
A: All versions listed on the PyPI badge (see the repository).
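For illustration, here is a minimal argparse application of the kind the server can (experimentally) wrap; the program and option names are hypothetical:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # A small CLI whose options an MCP server could expose as tool parameters.
    parser = argparse.ArgumentParser(prog="greet", description="Greet someone.")
    parser.add_argument("--name", default="world", help="Who to greet.")
    return parser

def main(argv=None) -> str:
    args = build_parser().parse_args(argv)
    message = f"Hello, {args.name}!"
    print(message)
    return message
```

Calling `main(["--name", "MCP"])` prints `Hello, MCP!`.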
This provides an extensible MCP server that is compatible with any Python command line application.
Supported frameworks:
- Click
- Typer
- argparse (experimental)
pip install pycli-mcp
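A minimal Click command of the kind the server can wrap might look like the following; the command and option names are hypothetical, and the `CliRunner` call at the end simply exercises the command in-process:

```python
import click
from click.testing import CliRunner

@click.command()
@click.option("--name", default="world", help="Who to greet.")
def greet(name):
    """Greet someone."""
    click.echo(f"Hello, {name}!")

# Invoke the command in-process, as a wrapper could when serving it over MCP.
result = CliRunner().invoke(greet, ["--name", "MCP"])
print(result.output)
```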
The documentation is built with Material for MkDocs and hosted on GitHub Pages.
`pycli-mcp` is distributed under the terms of the MIT license.