by MindscapeHQ
Provides comprehensive access to Raygun's API V3 endpoints for crash reporting and real user monitoring via the Model Context Protocol.
Enables interaction with Raygun's Crash Reporting and Real User Monitoring APIs through a Model Context Protocol (MCP) server, exposing a rich set of operations for applications, errors, deployments, sessions, performance data, source maps, and team management.
Runs via npx and requires the RAYGUN_PAT_TOKEN environment variable (your Raygun Personal Access Token). Optionally set SOURCEMAP_ALLOWED_DIRS to restrict source-map file access. Start the server with npx -y @raygun.io/mcp-server-raygun; npm run inspector launches the MCP Inspector UI.

What authentication is required?
Provide a Raygun Personal Access Token via the RAYGUN_PAT_TOKEN environment variable.

Do I need to install the server globally?
No. The recommended approach is to run it with npx -y @raygun.io/mcp-server-raygun, which fetches the package at runtime.

Can I restrict source-map operations?
Yes. Set SOURCEMAP_ALLOWED_DIRS to a comma-separated list of permitted directories.

How do I debug communication issues?
Use the provided MCP Inspector (npm run inspector), which offers a web UI for inspecting stdio traffic.

Is the server compatible with non-Claude clients?
Absolutely. Any MCP-compatible client can communicate over stdio with the server.
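As an illustration of that last point, the sketch below uses the MCP TypeScript SDK (@modelcontextprotocol/sdk) to spawn the server over stdio from a plain Node script and list its tools. The client name, version, and environment handling are illustrative assumptions, not part of the Raygun package.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the published server over stdio, the same way Claude Desktop would.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@raygun.io/mcp-server-raygun"],
  // Forward the current environment plus the required Raygun token.
  env: {
    ...process.env,
    RAYGUN_PAT_TOKEN: process.env.RAYGUN_PAT_TOKEN ?? "",
  } as Record<string, string>,
});

// Any MCP-compatible client works; the name and version here are arbitrary.
const client = new Client({ name: "raygun-stdio-demo", version: "0.1.0" });
await client.connect(transport);

// Discover the tools the server exposes (see the tool list below).
const { tools } = await client.listTools();
console.log(tools.map((tool) => tool.name));

await client.close();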
MCP server for Raygun's API V3 endpoints, for interacting with your Crash Reporting and Real User Monitoring applications. This server provides comprehensive access to Raygun's API features through the Model Context Protocol.
- list_applications - List all applications under your account
- get_application - Get application details by identifier
- get_application_by_api_key - Get application details by API key
- regenerate_application_api_key - Generate a new API key for an application
- list_error_groups - List error groups for an application
- get_error_group - Get detailed information about an error group
- resolve_error_group - Set error group status to resolved
- activate_error_group - Set error group status to active
- ignore_error_group - Set error group status to ignored
- permanently_ignore_error_group - Set error group status to permanently ignored
- list_deployments - List deployments for an application
- get_deployment - Get deployment details by identifier
- delete_deployment - Remove a deployment
- update_deployment - Update deployment information
- reprocess_deployment_commits - Reprocess deployment commit data
- list_customers - List customers for an application
- list_sessions - List user sessions for an application
- get_session - Get detailed session information
- list_pages - List monitored pages for an application
- get_page_metrics_time_series - Get time-series performance metrics
- get_page_metrics_histogram - Get histogram of performance metrics
- get_error_metrics_time_series - Get time-series error metrics
- list_source_maps - List source maps for an application
- get_source_map - Get source map details
- update_source_map - Update source map information
- delete_source_map - Remove a source map
- upload_source_map - Upload a new source map
- delete_all_source_maps - Remove all source maps
- list_invitations - List pending team invitations
- send_invitation - Send a new team invitation
- get_invitation - Get invitation details
- revoke_invitation - Revoke a pending invitation
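A hedged sketch of invoking these tools from the MCP TypeScript SDK client follows. The tool names come from the list above, but the argument shapes (for example errorGroupId) are illustrative guesses; read each tool's published input schema from listTools() rather than relying on this sketch.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@raygun.io/mcp-server-raygun"],
  env: {
    ...process.env,
    RAYGUN_PAT_TOKEN: process.env.RAYGUN_PAT_TOKEN ?? "",
  } as Record<string, string>,
});

const client = new Client({ name: "raygun-tool-demo", version: "0.1.0" });
await client.connect(transport);

// List applications on the account; whether this tool takes arguments is
// defined by its published input schema.
const apps = await client.callTool({ name: "list_applications", arguments: {} });
console.log(apps);

// Hypothetical argument name for illustration only; the real schema may differ.
await client.callTool({
  name: "resolve_error_group",
  arguments: { errorGroupId: "an-error-group-identifier" },
});

await client.close();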
The server requires the following environment variables:

- RAYGUN_PAT_TOKEN (required): Your Raygun PAT token
- SOURCEMAP_ALLOWED_DIRS (optional): Comma-separated list of directories allowed for source map operations

Add to your claude_desktop_config.json:
{
"mcpServers": {
"raygun": {
"command": "npx",
"args": ["-y", "@raygun.io/mcp-server-raygun"],
"env": {
"RAYGUN_PAT_TOKEN": "your-pat-token-here"
}
}
}
}
Install dependencies:
npm install
Build the server:
npm run build
For development with auto-rebuild:
npm run watch
To use with Claude Desktop, add the server config:
On MacOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
"mcpServers": {
"raygun": {
"command": "/path/to/server-raygun/build/index.js",
"env": {
"RAYGUN_PAT_TOKEN": "your-pat-token-ken"
}
}
}
}
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
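If you would rather probe the stdio channel directly than use the Inspector UI, a minimal Node/TypeScript probe can drive the MCP handshake by hand. The JSON-RPC method names (initialize, notifications/initialized, tools/list) come from the MCP specification; the protocol version string and the shortcut of not waiting for the initialize response are simplifications for a quick debugging probe, not how a production client should behave.

import { spawn } from "node:child_process";

// Start the server exactly as an MCP client would (stdio transport).
const server = spawn("npx", ["-y", "@raygun.io/mcp-server-raygun"], {
  env: { ...process.env, RAYGUN_PAT_TOKEN: process.env.RAYGUN_PAT_TOKEN ?? "" },
  stdio: ["pipe", "pipe", "inherit"],
});

// The MCP stdio transport carries newline-delimited JSON-RPC messages.
const send = (message: object) => server.stdin.write(JSON.stringify(message) + "\n");

// Echo everything the server writes so you can inspect the raw traffic.
server.stdout.setEncoding("utf8");
server.stdout.on("data", (chunk: string) => process.stdout.write(chunk));

// Handshake, then ask which tools the server exposes.
send({
  jsonrpc: "2.0",
  id: 1,
  method: "initialize",
  params: {
    protocolVersion: "2024-11-05",
    capabilities: {},
    clientInfo: { name: "stdio-probe", version: "0.0.1" },
  },
});
send({ jsonrpc: "2.0", method: "notifications/initialized" });
send({ jsonrpc: "2.0", id: 2, method: "tools/list", params: {} });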
{ "mcpServers": { "raygun": { "command": "npx", "args": [ "-y", "@raygun.io/mcp-server-raygun" ], "env": { "RAYGUN_PAT_TOKEN": "<YOUR_API_KEY>" } } } }
Explore related MCPs that share similar capabilities and solve comparable challenges
by Arize-ai
Open-source AI observability platform enabling tracing, evaluation, dataset versioning, experiment tracking, prompt management, and interactive playground for LLM applications.
by grafana
Provides programmatic access to a Grafana instance and its surrounding ecosystem through the Model Context Protocol, enabling AI assistants and other clients to query and manipulate dashboards, datasources, alerts, incidents, on‑call schedules, and more.
by dynatrace-oss
Provides a local server that enables real‑time interaction with the Dynatrace observability platform, exposing tools for querying data, retrieving problems, sending Slack notifications, and integrating AI assistance.
by VictoriaMetrics-Community
Provides a Model Context Protocol server exposing read‑only VictoriaMetrics APIs, enabling seamless monitoring, observability, and automation through AI‑driven assistants.
by GeLi2001
Enables interaction with the Datadog API through a Model Context Protocol server, providing access to monitors, dashboards, metrics, logs, events, and incident data.
by QAInsights
Execute JMeter test plans through Model Context Protocol clients, capture console output, generate HTML dashboards, and automatically analyze JTL results to surface performance metrics, bottlenecks, and actionable recommendations.
by grafana
Provides a Model Context Protocol (MCP) server that enables AI agents to query Grafana Loki log data via stdin/stdout or Server‑Sent Events, supporting both local binary execution and containerized deployment.
by TocharianOU
Provides a Model Context Protocol (MCP) server that enables MCP‑compatible clients to access, search, and manage Kibana APIs using natural language or programmatic requests.
by grafana
Provides Model Context Protocol endpoints that enable AI assistants to query and analyze distributed tracing data stored in Grafana Tempo, supporting both stdin/stdout communication and an HTTP SSE interface.