by snikithag
Logs patient symptoms, retrieves similar cases, and searches medical documents using retrieval‑augmented generation to support clinical decision making.
HealthcareRAGTools provides an agentic AI assistant for healthcare professionals. It captures patient symptom entries, compares them with previously logged cases, and searches a curated collection of medical documents. The system leverages Retrieval‑Augmented Generation (RAG) with a Groq‑hosted Llama 3.3 model and stores embeddings in a Chroma vector database.
Prerequisites: Python 3.8+, UV (pip install uv), and a Groq API key.

Quick Start
1. Create and activate a virtual environment:
   uv venv
   ragmcp\Scripts\activate
2. Install dependencies:
   set UV_LINK_MODE=copy
   uv pip install -r requirements.txt
3. Add GROQ_API_KEY=your-groq-api-key to the .env file.
4. Initialize the vector database:
   uv run python setup_db.py
5. Start the terminal client and interact with the chat interface to log symptoms or search documents:
   uv run python healthcare_client.py
6. To use the assistant from Cursor's agent chat, run uv run python server.py and copy the generated JSON configuration into Cursor → Settings → MCP/MCP Tools.

Customization
- The server configuration lives in healthcare.json, allowing easy adaptation to different datasets or models.
- Add .txt (or other supported) files to the documents/ folder and re-run setup_db.py to refresh the vector store.
- When changing the configured model in healthcare.json, ensure the corresponding API key is available in the .env file, which is excluded via .gitignore.
- Extend patient_records.json with additional entries; the server will automatically index them on startup.

HealthcareRAGTools Project
Overview
The HealthcareRAGTools project is an agentic AI system designed to assist healthcare professionals by leveraging Retrieval-Augmented Generation (RAG) techniques. This project integrates a Model Context Protocol (MCP) server with a Chroma vector database to log patient symptoms, retrieve similar cases, and search medical documents. It supports interactive queries via a terminal-based client and can also be used within the Cursor IDE’s agent chat interface.
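The README does not reproduce server.py, so the sketch below only illustrates how the pieces it names (FastMCP, Chroma, and Sentence-Transformers embeddings via the LangChain-Community wrappers) could fit together. It is a minimal illustrative sketch, not the project's actual code: the tool names, the all-MiniLM-L6-v2 embedding model, the chroma_db/ path, and the word-overlap shortcut for "similar cases" are assumptions, and the Groq-hosted Llama 3.3 model is driven from the client rather than shown here.

# sketch_server.py -- illustrative sketch only, not the project's server.py
import json
from pathlib import Path

from mcp.server.fastmcp import FastMCP
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import Chroma

mcp = FastMCP("HealthcareRAGTools")

# Reuse the vector store that setup_db.py persists under chroma_db/
embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
store = Chroma(persist_directory="chroma_db", embedding_function=embeddings)

RECORDS = Path("patient_records.json")

@mcp.tool()
def search_documents(query: str, k: int = 3) -> str:
    """Return the k medical document chunks most similar to the query."""
    docs = store.similarity_search(query, k=k)
    return "\n---\n".join(d.page_content for d in docs)

@mcp.tool()
def log_symptoms(patient_id: str, symptoms: str, severity: str) -> str:
    """Append a symptom entry to patient_records.json and report similar past cases."""
    records = json.loads(RECORDS.read_text()) if RECORDS.exists() else []
    records.append({"patient_id": patient_id, "symptoms": symptoms, "severity": severity})
    RECORDS.write_text(json.dumps(records, indent=2))
    # Naive word-overlap stands in for the project's embedding-based case retrieval.
    words = set(symptoms.lower().split())
    similar = [r["patient_id"] for r in records[:-1] if words & set(r["symptoms"].lower().split())]
    return (f"Symptoms logged for Patient {patient_id}: '{symptoms}' ({severity}). "
            f"Similar cases: {', '.join(similar) or 'None'}")

if __name__ == "__main__":
    mcp.run()  # serve the tools over stdio so MCP clients such as Cursor can call them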
Project Description
This project enables the following key functionalities:
- Logging patient symptom entries and storing them in patient_records.json.
- Retrieving previously logged cases that are similar to a new entry.
- Searching a curated collection of medical documents through RAG over a Chroma vector store.
- Interactive querying via a terminal-based client or the Cursor IDE's agent chat.
Tools Used
The project relies on the following tools and technologies:
Python 3.8+: The primary programming language for scripting and server logic.
FastMCP: A Python framework for building and running Model Context Protocol (MCP) servers.
LangChain: A library for building context-aware language models and agents.
LangChain-Groq: Integration with Groq's API for advanced language model capabilities.
Chroma: An open-source vector database for storing and retrieving embeddings of medical documents and patient data.
Sentence-Transformers: Used to generate embeddings for text data in the Chroma database.
HTTPX: For handling HTTP requests within the system.
Python-Dotenv: Manages environment variables, such as the Groq API key.
LangChain-Community: Additional community-supported LangChain integrations.
Requests: For making HTTP requests to external services.
UV: A package and virtual environment manager for dependency management.
Cursor IDE: The development environment, with plans to enable agent chat functionality.
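The setup steps later ask you to "ensure requirements.txt exists with the listed dependencies". A plausible requirements.txt assembled from the list above is shown below; the PyPI names (for example chromadb for Chroma and fastmcp for FastMCP) are assumptions, UV is installed separately, and version pins are omitted.

fastmcp
langchain
langchain-groq
langchain-community
chromadb
sentence-transformers
httpx
python-dotenv
requests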
Directory Structure
The project is organized as follows:
\Desktop\mcpserver\ragmcp
├── .env # Stores the GROQ_API_KEY environment variable
├── .gitignore # Excludes venv, chroma_db, and .env from version control
├── server.py # Main MCP server script with HealthcareRAGTools logic
├── setup_db.py # Script to initialize the Chroma vector database
├── healthcare_client.py # Terminal-based client for interactive queries
├── healthcare.json # Configuration file for the MCP server
├── requirements.txt # Lists project dependencies
├── documents\ # Directory for sample medical documents
│ ├── doc1.txt # Example document: Flu symptoms
│ ├── doc2.txt # Example document: Asthma symptoms
├── patient_records.json # JSON file storing patient symptom data
├── chroma_db\ # Directory for the Chroma vector database
└── ragmcp\ # Virtual environment directory
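The tree marks setup_db.py as the script that builds the Chroma store from the files in documents/. The repository's actual script is not shown in this README; a minimal sketch, assuming the LangChain-Community Chroma and HuggingFaceEmbeddings wrappers and an all-MiniLM-L6-v2 Sentence-Transformers model (both assumptions), might look like this:

# sketch_setup_db.py -- illustrative only
from pathlib import Path

from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import Chroma

def main() -> None:
    # Read every .txt file in documents/ (e.g. doc1.txt on flu, doc2.txt on asthma)
    texts = [p.read_text(encoding="utf-8") for p in Path("documents").glob("*.txt")]

    # Split long documents into overlapping chunks before embedding
    splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
    chunks = splitter.create_documents(texts)

    # Embed with Sentence-Transformers and persist to the chroma_db directory
    embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
    Chroma.from_documents(chunks, embeddings, persist_directory="chroma_db")

    print("Vector database initialized with sample medical documents.")

if __name__ == "__main__":
    main()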
Setup and Installation
To set up the project on your local machine, follow these steps:
Prerequisites
- Windows 10/11 with Command Prompt.
- Python 3.8+ installed.
- UV installed: pip install uv.
- A Groq API key from console.groq.com.

Steps
1. Create and Activate the Virtual Environment:
   uv venv
   ragmcp\Scripts\activate
   Confirm the prompt shows (ragmcp).
2. Install Dependencies: Ensure requirements.txt exists with the listed dependencies, then run:
   set UV_LINK_MODE=copy
   uv pip install -r requirements.txt
3. Configure Environment Variables: Create or edit .env with your Groq API key:
   GROQ_API_KEY=your-groq-api-key
4. Initialize the Database: Run the setup script to populate the Chroma database with sample documents:
   uv run python setup_db.py
   Expected output: Vector database initialized with sample medical documents.

Running the Project
Terminal-Based Client
1. Start the Client:
   uv run python healthcare_client.py
   Expected output:
   Loading environment variables...
   Loading config file: C:\Users\sniki\OneDrive\Desktop\mcpserver\ragmcp\healthcare.json
   Initializing HealthcareRAGTools chat...
   MCPClient initialized
   ChatGroq initialized

   ===== Interactive HealthcareRAGTools Chat =====
   Type 'exit' or 'quit' to end the conversation
   Type 'clear' to clear conversation history
   Example queries:

   You:
2. Test a Query: Type:
   Log symptoms for patient P123: fever and cough, severity Moderate, show similar cases
   Expected response:
   Assistant: Symptoms logged for Patient P123: 'fever and cough' (Moderate). Similar cases: None
3. Exit: Type exit or quit.
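healthcare_client.py is likewise not reproduced here. Its startup log shows it loading .env, reading healthcare.json, and initializing an MCP client alongside ChatGroq. The simplified loop below sketches only the ChatGroq conversation handling; the llama-3.3-70b-versatile model string is an assumption, and the MCP client/tool wiring that the real script performs is omitted.

# sketch_client.py -- illustrative only; omits the MCP client wiring of the real script
from dotenv import load_dotenv
from langchain_groq import ChatGroq

load_dotenv()  # makes GROQ_API_KEY from .env available
llm = ChatGroq(model="llama-3.3-70b-versatile", temperature=0)

history = []
print("===== Interactive HealthcareRAGTools Chat =====")
while True:
    user = input("You: ").strip()
    if user.lower() in ("exit", "quit"):
        break
    if user.lower() == "clear":
        history.clear()
        continue
    history.append(("human", user))
    reply = llm.invoke(history)  # send the whole conversation so far to the Groq-hosted model
    history.append(("ai", reply.content))
    print("Assistant:", reply.content)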
To run it in Cursor's Agent chat:
Copy the JSON configuration into File > Preferences > Cursor Settings > MCP/MCP Tools, then run the server as follows:
uv run python server.py
You can use the same queries here as well.