by Mng-dev-ai
Provides a self‑hosted Claude Code workspace that integrates multi‑provider routing, sandboxed code execution, and a full web‑based IDE.
Claudex delivers a self‑hosted Claude Code environment where developers can write, run, and debug code through a browser‑based IDE. It unifies multiple LLM providers (Anthropic, OpenAI, OpenRouter, GitHub Copilot, or custom Anthropic‑compatible endpoints) behind an Anthropic‑compatible bridge while preserving Claude Code‑native tooling, permissions, and MCP orchestration.
Web mode (Docker Compose)
git clone https://github.com/Mng-dev-ai/claudex.git
cd claudex
docker compose -p claudex-web -f docker-compose.yml up -d
Access the UI at http://localhost:3000. Backend API runs on 8080 with PostgreSQL and Redis containers.
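Once the stack is up, a quick smoke test can confirm the services respond. This is a sketch that assumes the GET /health and GET /api/v1/readyz endpoints (listed further down in this README) are served on the backend port 8080:

```shell
# Sketch: confirm the web-mode services respond (paths and ports assumed as above)
curl -sf http://localhost:8080/health && echo "backend healthy"
curl -sf http://localhost:8080/api/v1/readyz && echo "backend ready"
curl -sfo /dev/null http://localhost:3000/ && echo "frontend up"
```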
Desktop mode (macOS, Tauri)
Desktop mode runs a bundled Python backend sidecar on localhost:8081 and uses SQLite for storage.

cd frontend
npm install
npm run desktop:dev
Workspace management
Chats within a workspace share the same sandbox and .claude configuration, and are driven through claude-agent-sdk.

Q: How do I switch the LLM provider for an existing workspace? A: Open Settings → Providers, select the desired provider, and the workspace will continue using the same sandbox and chat history.
Q: Can I run Claudex without Docker? A: Yes. Desktop mode runs a bundled Python side‑car and uses a host‑based sandbox; no Docker is required.
Q: What runtimes are supported inside the sandbox? A: Any tool installable in the container/host environment (Node, Python, Rust, etc.) can be used through Claude Code’s tool execution flow.
Q: How are custom MCP servers added? A: Via the Extensions UI – upload a Docker image or point to a running service that implements the MCP protocol; Claudex will register it as a skill/agent.
Q: Is data persisted between restarts? A: In web mode, PostgreSQL stores workspace metadata and chat history; Redis caches live state. Desktop mode persists everything in a local SQLite file.
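Since web-mode persistence lives in PostgreSQL, backing up a deployment reduces to dumping that database. A sketch, in which the service name postgres, user postgres, and database name claudex are assumptions; check docker-compose.yml for the real values:

```shell
# Sketch: back up web-mode data (service, user, and db names are assumptions)
docker compose -p claudex-web -f docker-compose.yml exec postgres \
  pg_dump -U postgres claudex > claudex-backup.sql
```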
Self-hosted Claude Code workspace with multi-provider routing, sandboxed execution, and a full web IDE.
Note: Claudex is under active development. Expect breaking changes between releases.
Join the Discord server.
React/Vite Frontend
-> FastAPI Backend
-> PostgreSQL + Redis (web/docker mode)
-> SQLite + in-memory cache/pubsub (desktop mode)
-> Sandbox runtime (Docker/Host)
-> Claude Code CLI + claude-agent-sdk
Claudex runs chats through claude-agent-sdk, which drives the Claude Code CLI in the selected sandbox. This keeps Claude Code-native behavior for tools, session flow, permission modes, and MCP orchestration.
For OpenAI, OpenRouter, and Copilot providers, Claudex starts anthropic-bridge inside the sandbox and routes Claude Code requests through:
ANTHROPIC_BASE_URL=http://127.0.0.1:3456

The bridge reads OPENROUTER_API_KEY and GITHUB_COPILOT_TOKEN for authentication, and routes provider-prefixed model IDs such as openai/gpt-5.2-codex, openrouter/moonshotai/kimi-k2.5, and copilot/gpt-5.2-codex.

Claudex UI
-> Claude Agent SDK + Claude Code CLI
-> Anthropic-compatible request shape
-> Anthropic Bridge (OpenAI/OpenRouter/Copilot)
-> Target provider model
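Conceptually, the bridge setup is environment plumbing inside the sandbox: Claude Code keeps speaking the Anthropic API shape, and the local bridge translates. A sketch of the effective configuration (the API key value is a placeholder):

```shell
# Claude Code speaks the Anthropic API shape; the local bridge translates it.
export ANTHROPIC_BASE_URL=http://127.0.0.1:3456   # anthropic-bridge in the sandbox
export OPENROUTER_API_KEY=sk-or-xxxx              # placeholder credential
# Provider-prefixed model IDs tell the bridge where to route, e.g.:
#   openai/gpt-5.2-codex  openrouter/moonshotai/kimi-k2.5  copilot/gpt-5.2-codex
```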
For Anthropic providers, Claudex uses your Claude auth token directly. For custom providers, Claudex calls your configured Anthropic-compatible base_url.
Every chat runs through claude-agent-sdk; bridged model IDs (openai/*, openrouter/*, copilot/*) go through Anthropic Bridge.

Workspaces are the top-level organizational unit. Each workspace owns a dedicated sandbox and groups all related chats under one project context.
Each workspace gets its own sandbox instance (Docker container or host process). Chats within a workspace share the same filesystem, installed tools, and .claude configuration. Switching between workspaces switches the entire execution environment.
When creating a workspace you can override the default sandbox provider (Docker or Host). The provider is locked at creation time — all chats in that workspace use the same provider.
git clone https://github.com/Mng-dev-ai/claudex.git
cd claudex
docker compose -p claudex-web -f docker-compose.yml up -d
Open http://localhost:3000.
docker compose -p claudex-web -f docker-compose.yml down
docker compose -p claudex-web -f docker-compose.yml logs -f
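Upgrading an existing web deployment follows the same compose lifecycle. A sketch; the --build flag assumes some images are built locally from this repository:

```shell
# Sketch: update a running web-mode deployment to the latest code
git pull
docker compose -p claudex-web -f docker-compose.yml pull
docker compose -p claudex-web -f docker-compose.yml up -d --build
```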
Desktop mode uses Tauri with a bundled Python backend sidecar on localhost:8081 and local SQLite storage.

When running in desktop mode, the backend sidecar listens on port 8081:

Tauri Desktop App
-> React frontend (.env.desktop)
-> bundled backend sidecar (localhost:8081)
-> local SQLite database
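The .env.desktop file is what points the React frontend at the sidecar. A minimal sketch in which the variable name VITE_API_URL is an assumption (Vite env vars use the VITE_ prefix, but check the actual file in frontend/):

```shell
# .env.desktop (sketch -- variable name is an assumption, port from above)
VITE_API_URL=http://localhost:8081
```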
Requirements: Node.js (npm) and a Rust toolchain (required by Tauri); on macOS, the Xcode Command Line Tools.
Dev workflow:
cd frontend
npm install
npm run desktop:dev
Build (unsigned dev):
cd frontend
npm run desktop:build
App bundle output:
frontend/src-tauri/target/release/bundle/macos/Claudex.app

Desktop troubleshooting:

Free port 8081 if it is already in use.

Configure providers in Settings -> Providers.
anthropic: paste token from claude setup-token
openai: authenticate with OpenAI device flow in UI
copilot: authenticate with GitHub device flow in UI
openrouter: add OpenRouter API key and model IDs
custom: set Anthropic-compatible base_url, token, and model IDs

Example model IDs: openai (gpt-5.2-codex, gpt-5.2, gpt-5.3-codex), openrouter (moonshotai/kimi-k2.5, minimax/minimax-m2.1, google/gemini-3-pro-preview), custom (GLM-5, M2.5, or private org-specific endpoints, depending on your backend's compatibility).

Switching providers within a workspace does not require a new workspace: chats keep the same sandbox, chat history, and .claude resources (skills, agents, commands).

This is the main value of using Claude Code as the harness while changing inference providers behind Anthropic Bridge.
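For the anthropic provider, the token comes from Claude Code's own login flow:

```shell
# Generate a long-lived token, then paste it into Settings -> Providers -> anthropic
claude setup-token
```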
Default ports (web mode): 3000 (frontend), 8080 (backend API), 5432 (PostgreSQL), 6379 (Redis), plus 5900, 6080, and 8765 for additional internal services.

Health endpoints: GET /health and GET /api/v1/readyz

The frontend is served at / and the API under /api/*.

Apache 2.0. See LICENSE.
Contributions are welcome. Please open an issue first to discuss what you would like to change, then submit a pull request.