by universal-mcp
Standardized interface for interacting with Canva's tools and services via a unified API, built on the Universal MCP framework.
Q: Do I need a Canva account or API key?
A: Yes. The server expects the relevant Canva credentials (e.g., an API token) to be set in the .env file before starting the inspector.
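As a concrete sketch, server code typically reads such a credential from the environment once the .env file has been loaded; the variable name CANVA_API_TOKEN below is an assumption for illustration, not a documented key:

```python
import os

def get_canva_token() -> str:
    """Fetch the Canva API token from the environment.

    CANVA_API_TOKEN is a hypothetical variable name; check the project's
    .env handling for the exact key the server expects.
    """
    token = os.environ.get("CANVA_API_TOKEN", "").strip()
    if not token:
        raise RuntimeError("Set CANVA_API_TOKEN in your .env file first")
    return token
```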
Q: Can I run this on Windows?
A: Yes. Use the Windows-specific activation command (.venv\Scripts\Activate); the rest of the steps are identical.
Q: Is Docker support available?
A: The repository does not include a Dockerfile, but the server can be containerized by installing the same Python dependencies inside a base image.
Q: How do I add new tools?
A: Extend src/universal_mcp_canva/app.py with additional functions and document them in the tools README.
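The exact decorator and registration API come from the Universal MCP framework, so treat the following as an illustrative sketch of the general pattern only; the registry, decorator, tool name, and fields below are all hypothetical stand-ins:

```python
from typing import Callable

# Minimal stand-in for a tool registry; the real framework supplies its own.
TOOLS: dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Register a function as a callable tool under its own name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def upload_asset(name: str, url: str) -> dict:
    """Hypothetical Canva tool: describe an asset-upload request."""
    return {"tool": "upload_asset", "name": name, "url": url}
```

Each function added this way should also get an entry in src/universal_mcp_canva/README.md so the tools list stays current.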
Q: Where do I report bugs or request features?
A: Open an issue on the GitHub repository: https://github.com/universal-mcp/canva
This repository contains an implementation of a Canva Universal MCP (Model Context Protocol) server. It provides a standardized interface for interacting with Canva's tools and services through a unified API.
The server is built using the Universal MCP framework.
This implementation follows the MCP specification, ensuring compatibility with other MCP-compliant services and tools.
You can start using Canva directly from agentr.dev. Visit agentr.dev/apps and enable Canva.
If you have not used Universal MCP before, follow the setup instructions at agentr.dev/quickstart.
The full list of available tools is at ./src/universal_mcp_canva/README.md.
Ensure you have the following before you begin:
- uv package manager (pip install uv)
- MCP CLI

Follow the steps below to set up your development environment:
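To sanity-check the prerequisites, a small helper script (not part of the repository) can confirm that both executables are discoverable on your PATH:

```python
import shutil

def has_tool(name: str) -> bool:
    """Return True if an executable called `name` is discoverable on PATH."""
    return shutil.which(name) is not None

if __name__ == "__main__":
    # Report on the two prerequisites listed above.
    for cmd in ("uv", "mcp"):
        print(f"{cmd}: {'found' if has_tool(cmd) else 'missing'}")
```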
Sync Project Dependencies
uv sync
This installs all dependencies from pyproject.toml into a local virtual environment (.venv).
Activate the Virtual Environment
For Linux/macOS:
source .venv/bin/activate
For Windows (PowerShell):
.venv\Scripts\Activate
Start the MCP Inspector
mcp dev src/universal_mcp_canva/server.py
This will start the MCP inspector. Make note of the address and port shown in the console output.
Install the Application
mcp install src/universal_mcp_canva/server.py
├── src/
│ └── universal_mcp_canva/
│ ├── __init__.py # Package initializer
│ ├── server.py # Server entry point
│ ├── app.py # Application tools
│ └── README.md # List of application tools
├── tests/ # Test suite
├── .env # Environment variables for local development
├── pyproject.toml # Project configuration
└── README.md # This file
This project is licensed under the MIT License.
Generated with MCP CLI — Happy coding! 🚀