A framework for rapidly building AI‑native IDE products, offering extensible editor components, Electron desktop support, and MCP client integration.
OpenSumi provides a modular, extensible platform to create IDE‑style applications that are AI‑first and AI‑native. It bundles core editor services, plugin architecture, and support for Model Context Protocol (MCP) tools, enabling developers to focus on product features rather than infrastructure.
# Install dependencies
yarn install
# Initialize the workspace
yarn run init
# (Optional) download built‑in extensions
yarn run download-extension
# Start the development server
yarn run start
By default the project opens the tools/workspace folder. To launch a different workspace, set the MY_WORKSPACE environment variable:
MY_WORKSPACE=/path/to/your/project yarn run start
Refer to CONTRIBUTING.md for environment-setup details.
Q: Do I need to run an MCP server separately?
A: The framework includes an MCP client; you can connect it to any MCP server you operate. The repository does not ship a server, so you need to provide your own or use a hosted solution.
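If you choose to provide your own server, a minimal one can be written with the official MCP TypeScript SDK and exposed over stdio for the IDE's MCP client to launch. The sketch below is an illustrative assumption, not part of this repository: it presumes the @modelcontextprotocol/sdk and zod packages, and the exact SDK surface may differ by version.

// mcp-server.ts — minimal stdio MCP server (sketch; verify against the SDK docs)
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "my-tools", version: "0.1.0" });

// Register one example tool that a connected MCP client can call.
server.tool("add", { a: z.number(), b: z.number() }, async ({ a, b }) => ({
  content: [{ type: "text", text: String(a + b) }],
}));

// Talk to the connecting client over stdin/stdout.
await server.connect(new StdioServerTransport());

How the OpenSumi client discovers this server (the command, arguments, and environment used to start it) is configured on the IDE side; check the OpenSumi documentation for the current settings.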
Q: Can I use OpenSumi in a pure web browser?
A: Yes. The codeblitz and ide-startup-lite templates demonstrate web-only deployments.
Q: How do I add my own extensions?
A: Place the extension code under the extensions directory and reference it in the workspace configuration. The yarn run download-extension command can fetch pre-built extensions from the marketplace.
Q: Is TypeScript required for developing plugins?
A: While TypeScript is the primary language used by the core, JavaScript plugins are also supported.
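To make the two answers above concrete: an extension under the extensions directory is at minimum a folder with a package.json manifest (name, main entry, activation events) plus an entry module exporting activate/deactivate hooks. The snippet below is a hedged sketch assuming OpenSumi's VS Code-compatible extension API (the vscode module); the manifest fields and API namespaces available to you depend on your OpenSumi version, so check the extension documentation.

// extensions/hello-sample/src/extension.ts — sketch of a minimal extension entry point
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  // Commands registered here should also be declared in the extension's package.json.
  context.subscriptions.push(
    vscode.commands.registerCommand('helloSample.sayHello', () => {
      vscode.window.showInformationMessage('Hello from an OpenSumi extension!');
    }),
  );
}

export function deactivate() {}

Because the plugin host also accepts JavaScript, the same entry point can be written in plain JS; TypeScript is simply what the core itself uses.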
Q: Where can I find documentation?
A: Full documentation is hosted at https://opensumi.com, and the repo includes a docs folder with API references.
Example projects and templates, such as codeblitz and ide-startup-lite, are available for reference.
If you run into system-level environment dependencies, see the Development Environment Preparation guide for instructions on installing them.
For complete documentation: opensumi.com
You can see all the release notes and breaking changes here: CHANGELOG.md.
Read through our Contributing Guide to learn about our submission process, coding rules and more.
Want to report a bug, contribute some code, or improve documentation? Excellent! Read up on our Contributing Guidelines for contributing and then check out one of our issues labeled as help wanted or good first issue.
Go to our issues or discussions to create a topic; we will resolve it as soon as we can.
Let's build a better OpenSumi together.
Copyright (c) 2019-present Alibaba Group Holding Limited, Ant Group Co. Ltd.
Licensed under the MIT license.
This project contains various third-party code under other open source licenses.
See the NOTICE.md file for more information.