by universal-mcp
Provides a standardized API for interacting with Coda's tools and services via the Universal MCP framework.
The project implements a Coda Universal MCP server that exposes Coda's tools and services through a unified API, adhering to the MCP specification for seamless compatibility with other MCP‑compliant services.
Q: Do I need Docker to run the server?
A: No. The server runs directly on Python 3.11+ inside the uv-managed virtual environment.
Q: Which command starts the server in production?
A: The README shows the development command (mcp dev). For production you would typically run the server module (python -m universal_mcp_coda.server) under your chosen process manager.
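As a sketch of the process-manager approach, a systemd unit could wrap that command. The install path, user, and working directory below are placeholders, not values from this repository:

```ini
# Hypothetical systemd unit for running the server in production.
# All paths are placeholders -- adjust them for your host.
[Unit]
Description=Coda Universal MCP server
After=network.target

[Service]
WorkingDirectory=/opt/universal-mcp-coda
ExecStart=/opt/universal-mcp-coda/.venv/bin/python -m universal_mcp_coda.server
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Any equivalent supervisor (supervisord, a container entrypoint) works the same way: it just needs to run the server module inside the synced virtual environment.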
Q: Can I add my own tools?
A: Yes. Add modules under src/universal_mcp_coda/ and expose them through the server's application layer.
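As a minimal sketch of what such a module might contain: the function names below are hypothetical, and the way tools get registered depends on the Universal MCP framework (check app.py for the real mechanism). The Coda REST API base URL is Coda's documented public endpoint; everything else is illustrative:

```python
# Hypothetical tool helpers for a module under src/universal_mcp_coda/.
# Function names are illustrative; registration happens through the
# application layer (see app.py), which is not reproduced here.

CODA_API_BASE = "https://coda.io/apis/v1"  # Coda's public REST API base


def build_list_docs_request(api_token: str, limit: int = 25) -> dict:
    """Assemble the URL, headers, and query params for a 'list docs' call."""
    return {
        "url": f"{CODA_API_BASE}/docs",
        "headers": {"Authorization": f"Bearer {api_token}"},
        "params": {"limit": limit},
    }


def extract_doc_names(response_json: dict) -> list[str]:
    """Pull document names out of a Coda list-docs response body."""
    return [item["name"] for item in response_json.get("items", [])]
```

Keeping request construction and response parsing as pure functions like this makes the tool easy to unit-test without hitting the Coda API.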
Q: Is there a public endpoint for the server?
A: The server is intended for local or self-hosted deployment; you can expose it behind a reverse proxy if needed.
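For example, a minimal nginx stanza could front the server. The upstream address and port below are placeholders for whatever your server actually binds to:

```nginx
# Hypothetical reverse-proxy stanza; 127.0.0.1:8000 is a placeholder for
# the address/port the server reports on startup.
location /mcp/ {
    proxy_pass http://127.0.0.1:8000/;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
```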
Q: Where are the environment variables defined?
A: See the .env file for local development configuration.
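As an illustration, a local .env might look like the following. The variable name is an assumption; confirm the exact names the application code reads:

```shell
# Hypothetical .env for local development -- CODA_API_KEY is an
# illustrative name, not confirmed against the application code.
CODA_API_KEY=your-coda-api-token
```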
This repository contains an implementation of a Coda Universal MCP (Model Context Protocol) server. It provides a standardized interface for interacting with Coda's tools and services through a unified API.
The server is built using the Universal MCP framework.
This implementation follows the MCP specification, ensuring compatibility with other MCP-compliant services and tools.
You can start using Coda directly from agentr.dev. Visit agentr.dev/apps and enable Coda.
If you have not used Universal MCP before, follow the setup instructions at agentr.dev/quickstart
The full list of available tools is at ./src/universal_mcp_coda/README.md
Ensure you have the following before you begin:
Python 3.11 or newer
The uv package manager (install with pip install uv)
Follow the steps below to set up your development environment:
Sync Project Dependencies
uv sync
This installs all dependencies from pyproject.toml into a local virtual environment (.venv).
Activate the Virtual Environment
For Linux/macOS:
source .venv/bin/activate
For Windows (PowerShell):
.venv\Scripts\Activate
Start the MCP Inspector
mcp dev src/universal_mcp_coda/server.py
This will start the MCP inspector. Make note of the address and port shown in the console output.
Install the Application
mcp install src/universal_mcp_coda/server.py
This registers the server with your local MCP client; the official mcp CLI installs it into Claude Desktop by default.
The repository is organized as follows:
.
├── src/
│ └── universal_mcp_coda/
│ ├── __init__.py # Package initializer
│ ├── server.py # Server entry point
│ ├── app.py # Application tools
│ └── README.md # List of application tools
├── tests/ # Test suite
├── .env # Environment variables for local development
├── pyproject.toml # Project configuration
└── README.md # This file
This project is licensed under the MIT License.
Generated with MCP CLI — Happy coding! 🚀