by jkosik
Provides a Go‑based MCP server that exposes Splunk data through STDIO and Server‑Sent Events, enabling LLMs to call tools for saved searches, alerts, indexes, and macros.
MCP Server for Splunk implements Model Context Protocol (MCP) tooling for Splunk environments. It is a lightweight Go service that communicates over STDIO (default) or an SSE‑based HTTP API and exposes a set of predefined MCP tools that query Splunk objects such as saved searches, alerts, fired alerts, indexes, and macros.
A Go implementation of the MCP server for Splunk. Supports STDIO and SSE (Server-Sent Events HTTP API). Uses the github.com/mark3labs/mcp-go SDK. Configuration requires SPLUNK_URL (the Splunk REST API endpoint) and SPLUNK_TOKEN (a valid auth token). For production use, build a binary (go build -o mcp-server-splunk cmd/mcp-server-splunk/main.go) or use the Docker image, then supervise the process with systemd, docker-compose, or Kubernetes. Note that count is capped at 100 results per request.
list_splunk_saved_searches
- count (number, optional): Number of results to return (max 100, default 100)
- offset (number, optional): Offset for pagination (default 0)

list_splunk_alerts
- count (number, optional): Number of results to return (max 100, default 10)
- offset (number, optional): Offset for pagination (default 0)
- title (string, optional): Case-insensitive substring to filter alert titles

list_splunk_fired_alerts
- count (number, optional): Number of results to return (max 100, default 10)
- offset (number, optional): Offset for pagination (default 0)
- ss_name (string, optional): Search name pattern to filter alerts (default "*")
- earliest (string, optional): Time range to look back (default "-24h")

list_splunk_indexes
- count (number, optional): Number of results to return (max 100, default 10)
- offset (number, optional): Offset for pagination (default 0)

list_splunk_macros
- count (number, optional): Number of results to return (max 100, default 10)
- offset (number, optional): Offset for pagination (default 0)

internal/splunk/prompt.go implements an MCP Prompt that finds Splunk alerts for a specific keyword (e.g. GitHub or OKTA) and instructs Cursor to use multiple MCP tools to review all Splunk alerts, indexes and macros first in order to provide the best answer. cmd/mcp/server/main.go implements an MCP Resource in the form of a local CSV file with Splunk-related content, providing further context to the chat.

export SPLUNK_URL=https://your-splunk-instance:8089
export SPLUNK_TOKEN=your-splunk-token
# List available tools
echo '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}' | go run cmd/mcp-server-splunk/main.go | jq
# Call list_splunk_saved_searches tool
echo '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"list_splunk_saved_searches","arguments":{}}}' | go run cmd/mcp-server-splunk/main.go | jq
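Tool arguments go in the arguments field of the tools/call request. A minimal sketch of such a payload, using the count, offset, and title parameters from the tool reference above (the filter value "CRITICAL" is only an example):

```shell
# JSON-RPC payload calling list_splunk_alerts with pagination and a title filter.
REQ='{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"list_splunk_alerts","arguments":{"count":5,"offset":0,"title":"CRITICAL"}}}'
printf '%s\n' "$REQ"
```

Pipe it into the server the same way as the requests above: `printf '%s\n' "$REQ" | go run cmd/mcp-server-splunk/main.go | jq`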
export SPLUNK_URL=https://your-splunk-instance:8089
export SPLUNK_TOKEN=your-splunk-token
# Start the server
go run cmd/mcp-server-splunk/main.go -transport sse -port 3001
# Call the server and get Session ID from the output. Do not terminate the session.
curl http://localhost:3001/sse
# Keep the session running and use a different terminal window for the final MCP call
curl -X POST "http://localhost:3001/message?sessionId=YOUR_SESSION_ID" \
-H "Content-Type: application/json" \
-d '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}' | jq
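The session ID can also be extracted mechanically instead of being copied by hand. This is a sketch under the assumption that the SSE stream advertises the message endpoint as a line shaped like `data: /message?sessionId=...` (check the actual `curl http://localhost:3001/sse` output for the exact format):

```shell
# Example SSE line advertising the message endpoint (assumed shape).
SSE_LINE='data: /message?sessionId=abc123'

# Pull the sessionId query parameter out of the line.
SESSION_ID=$(printf '%s' "$SSE_LINE" | sed -n 's/.*sessionId=\([^ &]*\).*/\1/p')
echo "$SESSION_ID"   # prints: abc123
```

The extracted value can then be used directly: `curl -X POST "http://localhost:3001/message?sessionId=$SESSION_ID" ...`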
Dockerfile and smithery.yaml are used to support hosting this MCP server at [Smithery](https://smithery.ai/server/@jkosik/).
docker build -t mcp-server-splunk .
echo '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}' | \
docker run --rm -i \
-e SPLUNK_URL=https://your-splunk-instance:8089 \
-e SPLUNK_TOKEN=your-splunk-token \
mcp-server-splunk | jq
By configuring MCP Settings in Cursor, you can include remote data directly into the LLM context.

Integrate the STDIO or SSE MCP server (see below) and use Cursor Chat. Cursor will automatically try to use MCP Tools, Prompts, or Resources. Sample prompts:
- How many MCP tools for Splunk are available?
- How many Splunk indexes do we have?
- Can you list the first 5 Splunk macros including the underlying queries?
- How many alerts with "Alert_CRITICAL" in the name were fired in the last day?
- Read the MCP Resource "Data Dictionary" and find the contact person for the Splunk index XYZ.

Build the server:
go build -o cmd/mcp-server-splunk/mcp-server-splunk cmd/mcp-server-splunk/main.go
Update ~/.cursor/mcp.json
{
  "mcpServers": {
    "splunk_stdio": {
      "name": "Splunk MCP Server (STDIO)",
      "description": "MCP server for Splunk integration",
      "type": "stdio",
      "command": "/Users/juraj/data/github.com/jkosik/mcp-server-splunk/cmd/mcp-server-splunk/mcp-server-splunk",
      "env": {
        "SPLUNK_URL": "https://your-splunk-instance:8089",
        "SPLUNK_TOKEN": "your-splunk-token"
      }
    }
  }
}
Start the server:
export SPLUNK_URL=https://your-splunk-instance:8089
export SPLUNK_TOKEN=your-splunk-token
# Start the server
go run cmd/mcp-server-splunk/main.go -transport sse -port 3001
Update ~/.cursor/mcp.json
{
  "mcpServers": {
    "splunk_sse": {
      "name": "Splunk MCP Server (SSE)",
      "description": "MCP server for Splunk integration (SSE mode)",
      "type": "sse",
      "url": "http://localhost:3001/sse"
    }
  }
}
Certified by MCP Review: https://mcpreview.com/mcp-servers/jkosik/mcp-server-splunk