by Roblox
Facilitates communication between Roblox Studio and AI assistants via a Model Context Protocol server, enabling tool execution such as model insertion and code running directly from prompts.
Enables Roblox Studio to interact with AI tools like Claude Desktop and Cursor. The server acts as a bridge, handling long‑poll requests from a Studio plugin and forwarding tool commands to the AI client through a standard protocol.
rbx-studio-mcp
Run the downloaded rbx-studio-mcp executable (Windows) or the app bundle (macOS); to build from source instead, use cargo run. The command builds the server, registers it with Claude, and installs the Studio plugin; when it finishes, "The MCP Studio plugin is ready for prompts." appears in the output. Internally, an rmcp server communicates with Claude via stdio and exposes two tools: insert_model and run_code.
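A client invokes these tools through a standard MCP tools/call request over stdio. The sketch below shows the general shape of such a request; the argument name "command" is an assumption for illustration, not taken from the server's actual tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_code",
    "arguments": {
      "command": "print(\"Hello from Claude\")"
    }
  }
}
```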
Q: Will third‑party tools be able to modify my place? A: Yes. The MCP server grants connected tools read/write access to the open place. Review each tool’s privacy policy.
Q: Which operating systems are supported? A: Pre‑built binaries are provided for Windows and macOS. Building from source works on any platform with Rust.
Q: How do I restart the connection if it stops working? A: Close both Roblox Studio and the AI client, then reopen them. The plugin will re‑establish the long‑poll connection on launch.
Q: Can I use a different AI assistant? A: Any client that implements the Model Context Protocol can be added by editing the MCP client config.
This repository contains a reference implementation of the Model Context Protocol (MCP) that enables communication between Roblox Studio (via a plugin) and Claude Desktop or Cursor. It consists of the following Rust-based components, which communicate through internal shared objects:
- A web server built on axum that a Studio plugin long polls.
- An rmcp server that talks to Claude via stdio transport.

When the LLM requests a tool run, the plugin receives the request through the long poll and posts back a response, which is then forwarded to the Claude app.
Please note that this MCP server will be accessed by third-party tools, allowing them to modify and read the contents of your opened place. Third-party data handling and privacy practices are subject to their respective terms and conditions.
The setup process also includes a short script that installs the plugin and configures Claude Desktop.
This MCP server works with virtually any MCP client, but it automatically configures only Claude Desktop and Cursor, if they are found.
To set up automatically:
To set up manually, add the following to your MCP client config:
{
  "mcpServers": {
    "Roblox Studio": {
      "args": [
        "--stdio"
      ],
      "command": "Path-to-downloaded\\rbx-studio-mcp.exe"
    }
  }
}
On macOS the path would be something like "/Applications/RobloxStudioMCP.app/Contents/MacOS/rbx-studio-mcp"
if you move the app to the Applications directory.
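On macOS, the equivalent config entry would point at the binary inside the app bundle; a sketch using the path quoted above:

```json
{
  "mcpServers": {
    "Roblox Studio": {
      "args": [
        "--stdio"
      ],
      "command": "/Applications/RobloxStudioMCP.app/Contents/MacOS/rbx-studio-mcp"
    }
  }
}
```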
To build and install the MCP reference implementation from this repository's source code:
cargo run
This command carries out the following actions:
After the command completes, the Studio MCP Server is installed and ready for your prompts from Claude Desktop.
To make sure everything is set up correctly, follow these steps:
- Check that "The MCP Studio plugin is ready for prompts." appears in the output.
- Click the plugin's icon to toggle MCP communication with Claude Desktop on and off, which you can also verify in the console output.
- Send a prompt that uses the available tools (insert_model and run_code).

Note: You can fix common issues with setup by restarting Studio and Claude Desktop. Claude sometimes hides in the system tray, so ensure you've exited it completely.