by rajvirtual
Provides secure access to Oura Ring health metrics via the Model Context Protocol, enabling AI assistants to fetch, analyze, and visualize personal wellness data.
Enables AI assistants to retrieve and interpret a user’s Oura Ring data—sleep, activity, readiness, heart rate, and more—through a structured MCP interface.
```
git clone https://github.com/rajvirtual/oura-mcp-server.git
cd oura-mcp-server
npm install
```

Create a `.env` file in the project root:

```
OURA_TOKEN=your_personal_access_token_here
```

Build and start the server:

```
npm run build
npx -y oura-mcp-server
```
The server will listen for MCP requests from compatible AI assistants.

Q: Which Oura data can be accessed?
A: All metrics provided by the Oura API, including sleep, activity, readiness, heart rate, HRV, and custom tags.
Q: Do I need to run the server continuously?
A: Yes, the server must be running to respond to MCP calls from the AI assistant.
Q: How are visualizations delivered?
A: The AI includes a "visualize" request; the server generates image files (PNG/GIF) and returns URLs or embeds them in the response.
Q: Is the token stored securely?
A: The token is read from the `.env` file at runtime and is never hard-coded in source code.
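As a minimal sketch of this runtime lookup (the helper name is illustrative, not the server's actual code), the token can be read from the environment and validated once at startup:

```typescript
// Illustrative sketch (hypothetical helper, not the server's actual code):
// read OURA_TOKEN from the environment at runtime and fail fast if missing.
function getOuraToken(): string {
  const token = process.env.OURA_TOKEN;
  if (!token) {
    throw new Error("OURA_TOKEN is not set; add it to your .env file");
  }
  return token;
}

// Stand-in value so the sketch runs standalone; in practice a loader such as
// dotenv populates process.env from the .env file before this code runs.
process.env.OURA_TOKEN ??= "example_token";
console.log(getOuraToken());
```

Failing fast like this surfaces a missing token at startup rather than on the first API call.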
Q: Can I extend the prompt library?
A: Absolutely: add new prompt templates in the server's configuration to tailor analyses to personal needs.
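One way such a template could be shaped (the interface, field names, and template text below are hypothetical, for illustration only, not the server's actual configuration schema):

```typescript
// Hypothetical prompt-template shape: a reusable analysis prompt that the
// assistant fills in with a metric name and a date range.
interface PromptTemplate {
  name: string;
  description: string;
  render: (args: { metric: string; start: string; end: string }) => string;
}

const weeklyTrend: PromptTemplate = {
  name: "weekly-trend",
  description: "Summarize one week of a single Oura metric",
  render: ({ metric, start, end }) =>
    `Analyze my ${metric} data from ${start} to ${end} and summarize any trends.`,
};

console.log(weeklyTrend.render({ metric: "sleep", start: "2024-01-01", end: "2024-01-07" }));
```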
This server enables AI assistants to access and analyze your Oura Ring data through the Model Context Protocol (MCP). It provides a structured way to fetch and understand your health metrics.
Clone the repository:

```
git clone https://github.com/yourusername/oura-mcp-server.git
cd oura-mcp-server
```

Install dependencies:

```
npm install
```

Create a `.env` file in the root directory with your Oura API token:

```
OURA_TOKEN=your_personal_access_token_here
```

Build the project:

```
npm run build
```

Start the server:

```
npm start
```
You can ask Claude things like:
Claude can create visual charts to help you understand your health data. Simply ask Claude to "visualize" or "create a chart" of specific metrics. For example:
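Behind such a request, the server ultimately queries Oura's v2 REST API for the underlying data. A rough sketch of that call (the helper names are illustrative, not the server's actual code; the endpoint path and bearer-token auth follow Oura's public API):

```typescript
// Illustrative sketch: build and send the Oura v2 daily-sleep request that a
// "visualize sleep" prompt would ultimately draw its data from.
const OURA_API = "https://api.ouraring.com/v2";

function dailySleepUrl(start: string, end: string): string {
  return `${OURA_API}/usercollection/daily_sleep?start_date=${start}&end_date=${end}`;
}

async function fetchDailySleep(token: string, start: string, end: string) {
  const res = await fetch(dailySleepUrl(start, end), {
    headers: { Authorization: `Bearer ${token}` }, // token loaded from .env
  });
  if (!res.ok) throw new Error(`Oura API error: ${res.status}`);
  return res.json();
}

console.log(dailySleepUrl("2024-01-01", "2024-01-07"));
```

The returned JSON can then be charted or summarized for the user.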
This server follows these key guidelines:
Contributions are welcome! Please feel free to submit a Pull Request.
```json
{
  "mcpServers": {
    "oura-mcp-server": {
      "command": "npx",
      "args": ["-y", "oura-mcp-server"],
      "env": {
        "OURA_TOKEN": "<YOUR_PERSONAL_ACCESS_TOKEN>"
      }
    }
  }
}
```
Explore related MCPs that share similar capabilities and solve comparable challenges.
by modelcontextprotocol
An MCP server implementation that provides a tool for dynamic and reflective problem-solving through a structured thinking process.
by danny-avila
Provides a self‑hosted ChatGPT‑style interface supporting numerous AI models, agents, code interpreter, image generation, multimodal interactions, and secure multi‑user authentication.
by block
Automates engineering tasks on local machines, executing code, building projects, debugging, orchestrating workflows, and interacting with external APIs using any LLM.
by RooCodeInc
Provides an autonomous AI coding partner inside the editor that can understand natural language, manipulate files, run commands, browse the web, and be customized via modes and instructions.
by pydantic
A Python framework that enables seamless integration of Pydantic validation with large language models, providing type‑safe agent construction, dependency injection, and structured output handling.
by lastmile-ai
Build effective agents using Model Context Protocol and simple, composable workflow patterns.
by mcp-use
A Python SDK that simplifies interaction with MCP servers and enables developers to create custom agents with tool‑calling capabilities.
by nanbingxyz
A cross‑platform desktop AI assistant that connects to major LLM providers, supports a local knowledge base, and enables tool integration via MCP servers.
by gptme
Provides a personal AI assistant that runs directly in the terminal, capable of executing code, manipulating files, browsing the web, using vision, and interfacing with various LLM providers.