by ChristianHinge
Enables AI assistants and applications to query, retrieve, and transfer DICOM data from PACS, VNA, and other medical imaging systems.
Dicom Mcp provides a Model Context Protocol (MCP) server that exposes a set of tools for interacting with DICOM nodes. It lets downstream agents search patient, study, series, and instance metadata, extract text from encapsulated PDF reports, and move whole series or studies to other DICOM destinations (e.g., AI segmentation services).
Install it with uv tool install dicom-mcp or pip install dicom-mcp, then start the server by pointing it at a YAML configuration file (e.g. dicom-mcp /path/to/config.yaml, or via uv tool run). MCP clients then call its tools (query_patients, move_series, etc.) over the MCP transport (stdio, HTTP, etc.).

- query_patients, query_studies, query_series, and query_instances search DICOM metadata with flexible criteria.
- extract_pdf_text_from_dicom returns plain text from encapsulated PDF reports.
- move_series and move_study forward images to configured remote nodes.
- The server is built on pynetdicom and PyPDF2.

For local testing, run docker-compose up -d in the tests folder to start the bundled Orthanc server, then execute pytest from the project root. Server communication can be debugged with the MCP Inspector (see the command near the end of this page).

The dicom-mcp server enables AI assistants to query, read, and move data on DICOM servers (PACS, VNA, etc.).
🤝 Contribute • 📝 Report Bug • 📝 Blog Post 1
---------------------------------------------------------------------
🧑⚕️ User: "Any significant findings in John Doe's previous CT report?"
🧠 LLM → ⚙️ Tools:
query_patients → query_studies → query_series → extract_pdf_text_from_dicom
💬 LLM Response: "The report from 2025-03-26 mentions a history of splenomegaly (enlarged spleen)"
🧑⚕️ User: "What's the volume of his spleen at the last scan and the scan today?"
🧠 LLM → ⚙️ Tools:
(query_studies → query_series → move_series → query_series → extract_pdf_text_from_dicom) x2
(The move_series tool sends the latest CT to a DICOM segmentation node, which returns volume PDF report)
💬 LLM Response: "last year 2024-03-26: 412cm³, today 2025-04-10: 350cm³"
---------------------------------------------------------------------
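As a rough illustration of how an interaction like the one above reaches the server, here is a minimal sketch of calling a single dicom-mcp tool from Python over stdio using the official MCP client SDK. The launch command and the query_patients argument name ("name_pattern") are assumptions for illustration; the actual input schema of each tool is reported by list_tools().

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server as a subprocess; adjust the command to however dicom-mcp
# is installed on your machine (uv tool, pip console script, etc.).
params = StdioServerParameters(
    command="dicom-mcp",
    args=["/path/to/your_config.yaml"],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "name_pattern" is an assumed argument name; check list_tools().
            result = await session.call_tool(
                "query_patients", {"name_pattern": "DOE^JOHN"}
            )
            print(result.content)

asyncio.run(main())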
dicom-mcp provides tools to:

- Query patient, study, series, and instance metadata on DICOM nodes
- Extract text from encapsulated PDF reports stored as DICOM objects
- Send series and studies to other DICOM nodes via C-MOVE
- Manage, switch between, and verify connections to configured DICOM nodes
Install using uv or pip:
uv tool install dicom-mcp
Or by cloning the repository:
# Clone and set up development environment
git clone https://github.com/ChristianHinge/dicom-mcp
cd dicom-mcp
# Create and activate virtual environment
uv venv
source .venv/bin/activate
# Install with test dependencies
uv pip install -e ".[dev]"
dicom-mcp requires a YAML configuration file (config.yaml or similar) that defines the DICOM nodes and calling AE titles. Adapt the configuration, or keep it as is for compatibility with the sample Orthanc server.
nodes:
  main:
    host: "localhost"
    port: 4242
    ae_title: "ORTHANC"
    description: "Local Orthanc DICOM server"

current_node: "main"
calling_aet: "MCPSCU"
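To picture how this YAML maps onto runtime objects, here is a small illustrative sketch. The dataclass below is not dicom-mcp's internal representation; it simply mirrors the keys shown in the sample configuration.

from dataclasses import dataclass
import yaml  # PyYAML

@dataclass
class DicomNode:
    host: str
    port: int
    ae_title: str
    description: str = ""

def load_config(path: str) -> tuple[dict[str, DicomNode], str, str]:
    """Parse a dicom-mcp-style config file (illustrative sketch only)."""
    with open(path) as f:
        raw = yaml.safe_load(f)
    nodes = {name: DicomNode(**node) for name, node in raw["nodes"].items()}
    # current_node selects the active entry; calling_aet is our own AE title.
    return nodes, raw["current_node"], raw["calling_aet"]

On the sample above, load_config("config.yaml") would yield a single node named main pointing at localhost:4242 with AE title ORTHANC.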
[!WARNING] DICOM-MCP is not meant for clinical use, and should not be connected to live hospital databases or databases containing patient-sensitive data. Doing so could lead to both loss of patient data and leakage of patient data onto the internet. DICOM-MCP can be used with locally hosted open-weight LLMs for complete data privacy.
If you don't have a DICOM server available, you can run a local ORTHANC server using Docker:
Clone the repository and install test dependencies: pip install -e ".[dev]"
cd tests
docker-compose up -d
cd ..
pytest # uploads dummy PDF data to the Orthanc server
The Orthanc web interface is available at http://localhost:8042
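To confirm the container's DICOM port is reachable (independently of dicom-mcp), a quick C-ECHO with pynetdicom, one of the project's dependencies, can be used; this mirrors what the verify_connection tool does. Host, port, and AE titles are taken from the sample configuration, and the import path assumes pynetdicom 2.x.

from pynetdicom import AE
from pynetdicom.sop_class import Verification

ae = AE(ae_title="MCPSCU")              # calling AE title from the sample config
ae.add_requested_context(Verification)  # propose the Verification SOP class
assoc = ae.associate("localhost", 4242, ae_title="ORTHANC")
if assoc.is_established:
    status = assoc.send_c_echo()
    print(f"C-ECHO status: 0x{status.Status:04X}")  # 0x0000 means success
    assoc.release()
else:
    print("Could not associate with Orthanc; is the container running?")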
Add to your client configuration (e.g. claude_desktop_config.json
):
{
"mcpServers": {
"dicom": {
"command": "uv",
"args": ["tool","dicom-mcp", "/path/to/your_config.yaml"]
}
}
}
For development:
{
"mcpServers": {
"arxiv-mcp-server": {
"command": "uv",
"args": [
"--directory",
"path/to/cloned/dicom-mcp",
"run",
"dicom-mcp",
"/path/to/your_config.yaml"
]
}
}
}
dicom-mcp provides four categories of tools for interacting with DICOM servers and DICOM data:

- query_patients: Search for patients based on criteria like name, ID, or birth date.
- query_studies: Find studies using patient ID, date, modality, description, accession number, or Study UID.
- query_series: Locate series within a specific study using modality, series number/description, or Series UID.
- query_instances: Find individual instances (images/objects) within a series using instance number or SOP Instance UID.
- extract_pdf_text_from_dicom: Retrieve a specific DICOM instance containing an encapsulated PDF and extract its text content.
- move_series: Send a specific DICOM series to another configured DICOM node using C-MOVE.
- move_study: Send an entire DICOM study to another configured DICOM node using C-MOVE.
- list_dicom_nodes: Show the currently active DICOM node and list all configured nodes.
- switch_dicom_node: Change the active DICOM node for subsequent operations.
- verify_connection: Test the DICOM network connection to the currently active node using C-ECHO.
- get_attribute_presets: List the available levels of detail (minimal, standard, extended) for metadata query results.

The tools can be chained together to answer complex questions, as in the example interaction above (a client-side sketch follows below).
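The chained flow from the example interaction can be sketched client-side as follows, assuming an already-initialized ClientSession (as in the earlier stdio sketch). Tool argument names and the placeholder identifiers are assumptions for illustration; a real client would parse each result to obtain the IDs and UIDs needed for the next call.

from mcp import ClientSession

async def latest_report_text(session: ClientSession, name_pattern: str) -> str:
    # 1. Find the patient (argument name is an assumption; see list_tools()).
    patients = await session.call_tool("query_patients", {"name_pattern": name_pattern})
    # 2. Drill down. The literal IDs/UIDs below are placeholders that would
    #    normally be extracted from the previous result's content.
    studies = await session.call_tool("query_studies", {"patient_id": "PAT001"})
    series = await session.call_tool("query_series", {"study_instance_uid": "1.2.3"})
    # 3. Pull the text out of the encapsulated PDF report.
    report = await session.call_tool(
        "extract_pdf_text_from_dicom", {"sop_instance_uid": "1.2.3.4"}
    )
    return str(report.content)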
Tests require a running Orthanc DICOM server. You can use Docker:
# Navigate to the directory containing docker-compose.yml (e.g., tests/)
cd tests
docker-compose up -d
Run tests using pytest:
# From the project root directory
pytest
Stop the Orthanc container:
cd tests
docker-compose down
Use the MCP Inspector for debugging the server communication:
npx @modelcontextprotocol/inspector uv run dicom-mcp /path/to/your_config.yaml --transport stdio
{ "mcpServers": { "dicom-mcp": { "command": "npx", "args": [ "dicom-mcp", "/path/to/your_config.yaml" ], "env": {} } } }