by OpsLevel
Provides AIs with read‑only access to OpsLevel resources through the Model Context Protocol, enabling AI assistants to query actions, components, teams, and other platform data.
OpsLevel MCP Server enables AI tools to interact with an OpsLevel account by exposing a read‑only Model Context Protocol (MCP) interface. It surfaces key OpsLevel resources—such as actions, campaigns, checks, components, documentation, domains, infrastructure, repositories, systems, teams, and users—so AI assistants can retrieve accurate, up‑to‑date information.
Install via Homebrew:

    brew install opslevel/tap/opslevel-mcp

Or pull the Docker image:

    docker pull public.ecr.aws/opslevel/mcp:latest

Authentication uses the OPSLEVEL_API_TOKEN environment variable. Obtain the token from the OpsLevel UI under API Tokens. Configure your MCP client with the server command (opslevel-mcp) and the required environment variable.

Q: Do I need write permissions for the API token?
A: No. The server only performs read‑only operations, so a token with read access is sufficient.
Q: Can I run the server locally?
A: Yes. Install via Homebrew or download the binary and execute opslevel-mcp.
Q: How do I use the server with Docker?
A: Replace the command in your MCP config with a Docker invocation, e.g.:
"command": "docker",
"args": ["run", "-i", "--rm", "-e", "OPSLEVEL_API_TOKEN", "public.ecr.aws/opslevel/mcp:latest"]
Q: Which AI tools are supported?
A: Claude Desktop, VS Code Copilot, Cursor, Warp, and Windsurf all have built‑in MCP support.
Q: Is there a way to limit which resources are exposed?
A: Not currently; the server exposes all read‑only endpoints defined by OpsLevel.
This MCP (Model Context Protocol) server provides AIs with tools to interact with your OpsLevel account.
Currently, the MCP server only uses read-only access to your OpsLevel account and can read data from the following resources:

- Actions
- Campaigns
- Checks
- Components
- Documentation
- Domains
- Infrastructure
- Repositories
- Systems
- Teams
- Users

Install via Homebrew:

    brew install opslevel/tap/opslevel-mcp

Or pull the Docker image:

    docker pull public.ecr.aws/opslevel/mcp:latest

Setup: Claude Desktop

Edit the configuration file:

- macOS: ${HOME}/Library/Application\ Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json

{
"mcpServers": {
"opslevel": {
"command": "opslevel-mcp",
"env": {
"OPSLEVEL_API_TOKEN": "XXXXXXX"
}
}
}
}
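If you manage this file by hand often, the edit can also be scripted. The following Python sketch is illustrative only (the helper name and merge behavior are not part of opslevel-mcp): it adds the opslevel entry to a Claude Desktop config file while preserving any other servers already configured there.

```python
import json
from pathlib import Path

# Hypothetical helper (not part of opslevel-mcp): merge the OpsLevel MCP
# server entry into a Claude Desktop config file, keeping any other
# servers that are already configured.
def add_opslevel_server(config_path: str, api_token: str) -> dict:
    path = Path(config_path)
    # Load the existing config if present; start from an empty one otherwise.
    config = json.loads(path.read_text()) if path.exists() else {}
    servers = config.setdefault("mcpServers", {})
    servers["opslevel"] = {
        "command": "opslevel-mcp",
        "env": {"OPSLEVEL_API_TOKEN": api_token},
    }
    path.write_text(json.dumps(config, indent=2))
    return config
```

Pointed at the platform-specific path shown above (and given a real token), this writes the same JSON shape as the block just shown.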
Setup: VS Code

Edit ${HOME}/Library/Application\ Support/Code/User/settings.json:

{
"chat.agent.enabled": true,
"chat.mcp.discovery.enabled": true,
"mcp": {
"inputs": [
{
"type": "promptString",
"id": "opslevel_token",
"description": "OpsLevel API Token",
"password": true
}
],
"servers": {
"opslevel": {
"type": "stdio",
"command": "opslevel-mcp",
"env": {
"OPSLEVEL_API_TOKEN": "${input:opslevel_token}"
}
}
}
}
}
Setup: Cursor

{
  "mcpServers": {
    "opslevel": {
      "command": "opslevel-mcp",
      "env": {
        "OPSLEVEL_API_TOKEN": "XXXXXX"
      }
    }
  }
}
Setup: Warp

{
  "opslevel": {
    "command": "opslevel-mcp",
    "args": [],
    "env": {
      "OPSLEVEL_API_TOKEN": "XXXXXX"
    },
    "start_on_launch": true
  }
}
Setup: Windsurf

{
  "mcpServers": {
    "opslevel": {
      "command": "opslevel-mcp",
      "env": {
        "OPSLEVEL_API_TOKEN": "XXXXXX"
      }
    }
  }
}
If you didn't install the binary directly and instead pulled the Docker image, you'll need to adjust the above MCP configurations to run the server via Docker:
"command": "docker",
"args": [
"run",
"-i",
"--rm",
"-e",
"OPSLEVEL_API_TOKEN",
"public.ecr.aws/opslevel/mcp:latest"
],
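Substituting that command and args into the earlier Claude Desktop config, a complete Docker-based entry might look like the sketch below (the token value is a placeholder; keeping the "env" block ensures the client exports the variable that `-e OPSLEVEL_API_TOKEN` forwards into the container):

```json
{
  "mcpServers": {
    "opslevel": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "OPSLEVEL_API_TOKEN",
        "public.ecr.aws/opslevel/mcp:latest"
      ],
      "env": {
        "OPSLEVEL_API_TOKEN": "XXXXXX"
      }
    }
  }
}
```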