by playcanvas
Enables AI‑driven automation of the PlayCanvas Editor using language‑model commands through an MCP server.
Editor MCP Server provides a bridge between large language models (LLMs) such as Anthropic Claude and the PlayCanvas Editor, allowing LLMs to issue commands that manipulate entities, assets, scenes, and the PlayCanvas Store directly from a chat interface.
Quick start:

1. Run `npm install` in the repository.
2. Configure your MCP client to launch the server (e.g. `npx tsx /path/to/mcp-editor/src/server.ts`). The server listens on the port specified in the config (default 52000).
3. Add the server to your MCP config using the `mcp.json`/`claude_desktop_config.json` template below.
4. Open `chrome://extensions/`, load the extensions folder, then open the PlayCanvas Editor and click CONNECT in the extension popup.
5. Issue commands through the exposed tools (e.g. `list_entities`, `create_assets`).

Q: Which LLM works best with this server? A: Anthropic Claude (Pro tier) provides sufficient context length; Claude Desktop tends to be more reliable than Cursor.
Q: Can I run multiple editor instances simultaneously? A: No. Only one PlayCanvas Editor instance can be connected to the MCP server at a time.
Q: Do I need a paid Claude account? A: The free tier often lacks enough chat context for reliable operation, so a Pro account is strongly recommended.
Q: How do I change the server port? A: Edit the PORT value in the MCP config JSON (env section) and restart the server.
Q: What if I want to use a different LLM? A: The server itself is LLM‑agnostic; you only need to ensure the client (Claude Desktop, Cursor, etc.) can forward tool calls to the MCP server.
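For any other MCP-capable client, the flow is the same: spawn the server process and forward tool calls to it. Below is a minimal sketch using the MCP TypeScript SDK (`@modelcontextprotocol/sdk`); the package, the stdio transport, and the client name here are assumptions for illustration and are not documented by this project.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server the same way the MCP config below does (assumed stdio transport).
const transport = new StdioClientTransport({
  command: "npx",
  args: ["tsx", "/path/to/mcp-editor/src/server.ts"],
  env: { PORT: "52000" },
});

const client = new Client(
  { name: "example-client", version: "0.1.0" },
  { capabilities: {} }
);

await client.connect(transport);

// Ask the server which tools it exposes (list_entities, create_assets, ...).
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));
```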
PlayCanvas MCP Server
An MCP Server for automating the PlayCanvas Editor using an LLM.
[!IMPORTANT]
At the moment, the MCP Server needs to be driven by Anthropic's Claude. Our experience shows that the free tier for Claude does not deliver a big enough chat context to operate the MCP Server reliably. Therefore, we strongly recommend subscribing to a Pro Claude account.
Available tools: `list_entities`, `create_entities`, `delete_entities`, `duplicate_entities`, `modify_entities`, `reparent_entity`, `add_components`, `remove_components`, `add_script_component_script`, `list_assets`, `create_assets`, `delete_assets`, `instantiate_template_assets`, `set_script_text`, `script_parse`, `set_material_diffuse`, `query_scene_settings`, `modify_scene_settings`, `store_search`, `store_get`, `store_download`.

Run `npm install` to install all dependencies.
To install the Chrome extension:

1. Go to `chrome://extensions/` and enable Developer mode.
2. Click Load unpacked and select the extensions folder.

The MCP Server can be driven by Cursor or Claude Desktop.
[!TIP]
We have found Claude Desktop to be generally more reliable.
For Claude Desktop:

1. Go to Claude > Settings.
2. Select Developer and then Edit Config.
3. This opens claude_desktop_config.json, your MCP Config JSON file.

For Cursor:

1. Go to File > Preferences > Cursor Settings.
2. Select + Add new global MCP server.
3. This opens mcp.json, your MCP Config JSON file.

[!TIP]
Also in Cursor Settings, select Features and scroll to the Chat section. Activate Enable auto-run mode to allow the LLM to run MCP tools without requiring constant authorization. You do this at your own risk (but we prefer it)!
[!IMPORTANT]
In Cursor, ensure you have Agent selected. Ask and Edit modes will not recognize the MCP Server.
This is how your config should look:
Windows
```json
{
  "mcpServers": {
    "playcanvas": {
      "command": "cmd",
      "args": [
        "/c",
        "npx",
        "tsx",
        "C:\\path\\to\\mcp-editor\\src\\server.ts"
      ],
      "env": {
        "PORT": "52000"
      }
    }
  }
}
```
macOS
```json
{
  "mcpServers": {
    "playcanvas": {
      "command": "npx",
      "args": [
        "tsx",
        "/path/to/mcp-editor/src/server.ts"
      ],
      "env": {
        "PORT": "52000"
      }
    }
  }
}
```
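The `PORT` value in the `env` block is passed to the spawned server process as an environment variable and is the port the Editor extension later connects to. A minimal sketch of how such a setting is typically read in a Node/TypeScript process (illustrative only, not the project's actual `server.ts`):

```typescript
// Illustrative only: read the PORT supplied via the MCP config's "env" block,
// falling back to the documented default of 52000.
const port = Number(process.env.PORT ?? "52000");
console.log(`Waiting for the PlayCanvas Editor extension on port ${port}`);
```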
The PlayCanvas Editor does not connect to the MCP Server automatically. To connect:
In the Editor, click CONNECT in the extension popup (the port number should match what is set in your MCP Config JSON file).

[!NOTE]
You can currently only connect one instance of the PlayCanvas Editor to the MCP Server at any one time.
You should now be able to issue commands in Claude Desktop or Cursor.
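Behind the scenes, each chat instruction is turned into one or more MCP tool calls. As a rough illustration (reusing the assumed SDK client from the earlier sketch), `list_entities` can be called with empty arguments; other tools take parameters defined by their input schemas, which `listTools()` reports:

```typescript
// Assumes `client` is the connected MCP client from the earlier sketch.
// list_entities is shown with empty arguments; check each tool's input schema
// (returned by listTools()) for the parameters it actually expects.
const result = await client.callTool({
  name: "list_entities",
  arguments: {},
});
console.log(result.content);
```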
You can also register the server from the command line with Claude Code: `claude mcp add playcanvas npx tsx /path/to/mcp-editor/src/server.ts`

Explore related MCPs that share similar capabilities and solve comparable challenges:
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.