by grafana
Provides Model Context Protocol endpoints that enable AI assistants to query and analyze distributed tracing data stored in Grafana Tempo, supporting both stdin/stdout communication and an HTTP SSE interface.
Tempo MCP Server implements the Model Context Protocol to expose Tempo tracing data as AI‑usable tools. It allows AI agents (e.g., Claude Desktop, Cursor) to run trace queries, retrieve results, and incorporate them into workflows.
```shell
# Build and run the server
go build -o tempo-mcp-server ./cmd/server
./tempo-mcp-server

# Or run directly
go run ./cmd/server
```

The HTTP server listens on port 8080 by default (configurable via `SSE_PORT`):

- `http://localhost:8080/sse`
- `http://localhost:8080/mcp`
```shell
docker build -t tempo-mcp-server .
docker run -p 8080:8080 --rm -i tempo-mcp-server
```
Key features:

- `tempo_query` tool with a required `query` parameter and optional filters (`url`, `start`, `end`, `limit`, auth options)
- Configuration through environment variables (`TEMPO_URL`, `SSE_PORT`)
- A helper script (`run-client.sh`) for quick testing

FAQ

Q: Which port does the server listen on?
A: Default is 8080; change it with the `SSE_PORT` environment variable.

Q: How does authentication work for Tempo queries?
A: Provide `username`/`password` for basic auth, or a `token` for Bearer authentication. These can be passed in the tool request or set via environment variables.

Q: Can I run the server without Docker?
A: Yes, compile with Go and execute the binary directly.

Q: What formats are returned by the `tempo_query` tool?
A: The tool returns JSON-encoded trace metadata, capped by the `limit` parameter.

Q: Is there support for multiple MCP tools?
A: Currently only `tempo_query` is shipped, but the architecture allows adding more handlers under `internal/handlers`.
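The two authentication modes described in the FAQ can be sketched in Go. `authHeader` is a hypothetical helper written for illustration, not a function from this repository:

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// authHeader builds an Authorization header value from the tool's auth
// options: a Bearer token takes precedence, otherwise username/password
// are combined into a standard Basic credential. Hypothetical helper;
// the server's actual implementation may differ.
func authHeader(username, password, token string) string {
	if token != "" {
		return "Bearer " + token
	}
	if username != "" {
		cred := base64.StdEncoding.EncodeToString([]byte(username + ":" + password))
		return "Basic " + cred
	}
	return "" // no auth configured
}

func main() {
	fmt.Println(authHeader("", "", "my-token")) // Bearer my-token
}
```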
A Go-based server implementation for the Model Context Protocol (MCP) with Grafana Tempo integration.
This MCP server allows AI assistants to query and analyze distributed tracing data from Grafana Tempo. It follows the Model Context Protocol to provide tool definitions that can be used by compatible AI clients such as Claude Desktop.
Build and run the server:

```shell
# Build the server
go build -o tempo-mcp-server ./cmd/server

# Run the server
./tempo-mcp-server
```

Or run directly with Go:

```shell
go run ./cmd/server
```
The server supports two modes of communication: stdin/stdout for direct MCP clients, and an HTTP server with Server-Sent Events (SSE). The default port for the HTTP server is 8080, but it can be configured using the `SSE_PORT` environment variable.
When running in HTTP mode, the server exposes the following endpoints:

- `http://localhost:8080/sse` - for real-time event streaming
- `http://localhost:8080/mcp` - for MCP protocol messaging

You can build and run the MCP server using Docker:
```shell
# Build the Docker image
docker build -t tempo-mcp-server .

# Run the server
docker run -p 8080:8080 --rm -i tempo-mcp-server
```
Alternatively, you can use Docker Compose for a complete test environment:
```shell
# Build and run with Docker Compose
docker-compose up --build
```
Project structure:

```
.
├── cmd/
│   ├── server/    # MCP server implementation
│   └── client/    # Client for testing the MCP server
├── internal/
│   └── handlers/  # Tool handlers
├── pkg/
│   └── utils/     # Utility functions and shared code
└── go.mod         # Go module definition
```
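To give a feel for how additional tool handlers under `internal/handlers` could be wired up, here is a deliberately simplified, hand-rolled registry. This is a sketch only; the real server presumably registers handlers through an MCP SDK rather than a map like this:

```go
package main

import (
	"errors"
	"fmt"
)

// HandlerFunc is the shape of a tool handler in this sketch: it takes the
// tool arguments and returns a text result.
type HandlerFunc func(args map[string]any) (string, error)

// Registry maps tool names to handlers, mimicking how an MCP server
// dispatches tools/call requests to the matching handler.
type Registry struct{ tools map[string]HandlerFunc }

func NewRegistry() *Registry { return &Registry{tools: map[string]HandlerFunc{}} }

func (r *Registry) Register(name string, h HandlerFunc) { r.tools[name] = h }

func (r *Registry) Call(name string, args map[string]any) (string, error) {
	h, ok := r.tools[name]
	if !ok {
		return "", errors.New("unknown tool: " + name)
	}
	return h(args)
}

func main() {
	reg := NewRegistry()
	reg.Register("tempo_query", func(args map[string]any) (string, error) {
		return fmt.Sprintf("querying Tempo with %v", args["query"]), nil
	})
	out, _ := reg.Call("tempo_query", map[string]any{"query": `{duration>1s}`})
	fmt.Println(out)
}
```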
The Tempo MCP Server implements the Model Context Protocol (MCP) and provides the following tools:
The `tempo_query` tool allows you to query Grafana Tempo trace data. Parameters:

- `query`: Tempo query string (e.g., `{service.name="frontend"}`, `{duration>1s}`)
- `url`: The Tempo server URL (default: from the `TEMPO_URL` environment variable, or `http://localhost:3200`)
- `start`: Start time for the query (default: 1h ago)
- `end`: End time for the query (default: now)
- `limit`: Maximum number of traces to return (default: 20)
- `username`: Username for basic authentication (optional)
- `password`: Password for basic authentication (optional)
- `token`: Bearer token for authentication (optional)

The Tempo query tool supports the following environment variables:

- `TEMPO_URL`: Default Tempo server URL to use if not specified in the request
- `SSE_PORT`: Port for the HTTP/SSE server (default: 8080)

For quick testing, use the bundled client script:

```shell
./run-client.sh tempo_query "{resource.service.name=\\\"example-service\\\"}"
```
You can use this MCP server with Claude Desktop to add Tempo query tools by registering it in your Claude Desktop configuration file.
Example Claude Desktop configuration:
```json
{
  "mcpServers": {
    "temposerver": {
      "command": "path/to/tempo-mcp-server",
      "args": [],
      "env": {
        "TEMPO_URL": "http://localhost:3200"
      },
      "disabled": false,
      "autoApprove": ["tempo_query"]
    }
  }
}
```
For Docker:
```json
{
  "mcpServers": {
    "temposerver": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "-e", "TEMPO_URL=http://host.docker.internal:3200", "tempo-mcp-server"],
      "disabled": false,
      "autoApprove": ["tempo_query"]
    }
  }
}
```
The Claude Desktop configuration file is located at:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
You can also integrate the Tempo MCP server with the Cursor editor. To do this, add the following configuration to your Cursor settings:
```json
{
  "mcpServers": {
    "tempo-mcp-server": {
      "command": "docker",
      "args": ["run", "--rm", "-i", "-e", "TEMPO_URL=http://host.docker.internal:3200", "tempo-mcp-server:latest"]
    }
  }
}
```
To use the Tempo MCP server with n8n, connect to it using the MCP Client Tool node:

1. Add an MCP Client Tool node to your n8n workflow
2. Configure the node with the SSE endpoint `http://your-server-address:8080/sse` (replace with your actual server address)
3. Connect the MCP Client Tool node to an AI Agent node that will use the Tempo querying capabilities

Example workflow: Trigger → MCP Client Tool (Tempo server) → AI Agent (Claude)
Once configured, you can use the tools in Claude with queries like:

- `{duration>1s}`
- `{service.name="frontend"}`
- `{http.status_code=500}`

This project is licensed under the MIT License.