by teddyzxcv
Send ntfy push notifications whenever an AI assistant finishes a task, allowing you to stay informed without interrupting your workflow.
ntfy-mcp integrates with the Model Context Protocol to dispatch ntfy notifications as soon as a task completes. It acts as a lightweight MCP server that bridges AI task results with the ntfy notification service.
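Under the hood, publishing an ntfy notification is just an HTTP POST to your topic's URL on the ntfy server. A minimal sketch of that publish step (illustrative only, not the server's actual source; the topic name and function are placeholders, and Node 18+ built-in fetch is assumed):

// Sketch: publish a task-completion message to an ntfy topic.
// Assumes the public ntfy.sh server and Node 18+ (global fetch).
const topic = process.env.NTFY_TOPIC ?? "my-ai-tasks"; // placeholder topic

async function notifyTaskDone(message: string): Promise<void> {
  const res = await fetch(`https://ntfy.sh/${topic}`, {
    method: "POST",
    body: message,                          // the notification body
    headers: { Title: "AI task finished" }, // optional ntfy title header
  });
  if (!res.ok) {
    throw new Error(`ntfy publish failed: ${res.status}`);
  }
}

notifyTaskDone("Your hello-world script is ready ☕️").catch(console.error);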
git clone https://github.com/teddyzxcv/ntfy-mcp.git
cd ntfy-mcp
npm install
npm run build
npm start
"ntfy-mcp": {
"command": "node",
"args": ["/path/to/ntfy-mcp/build/index.js"],
"env": { "NTFY_TOPIC": "<your topic name>" },
"autoApprove": ["notify_user"]
}
Include the phrase notify me in your prompt to trigger a notification.
Q: Do I need an ntfy account? A: No, ntfy works anonymously; just install the mobile/desktop app and subscribe to a topic.
Q: Which environment variable defines the notification channel? A: Set NTFY_TOPIC to the name of the topic you want to receive messages on.
Q: Can I customize the notification message? A: The server sends a simple payload containing the task result; further customization requires extending the source code.
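If you do extend the server, the ntfy HTTP API accepts extra headers such as Title, Priority, and Tags on the publish request. A hedged sketch of a richer payload (the helper below is hypothetical, not part of the shipped server):

// Sketch: a customized ntfy publish with title, priority, and tag emojis.
// Hypothetical helper; the real server only sends a simple payload.
async function notifyWithDetails(topic: string, result: string): Promise<void> {
  await fetch(`https://ntfy.sh/${topic}`, {
    method: "POST",
    body: result,
    headers: {
      Title: "Task complete",
      Priority: "high",    // ntfy priorities: min, low, default, high, max
      Tags: "tada,coffee", // rendered as emojis in the ntfy apps
    },
  });
}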
Welcome to ntfy-mcp, the MCP server that keeps you caffeinated and informed! 🚀☕️
This handy little server integrates with the Model Context Protocol to send you delightful ntfy notifications whenever your AI assistant completes a task. Because let's face it - you deserve that tea break while your code writes itself.
git clone https://github.com/teddyzxcv/ntfy-mcp.git
cd ntfy-mcp
npm install
npm run build
Choose your adventure:
Manual Start:
npm start
Cline Configuration:
"ntfy-mcp": {
"command": "node",
"args": [
"/path/to/ntfy-mcp/build/index.js"
],
"env": {
"NTFY_TOPIC": "<your topic name>"
},
"autoApprove": [
"notify_user" // Highly recommended for maximum chill
]
}
Write a prompt like the one below, otherwise the function won't be called.
(I tried using Custom Instructions in Cline, but they sit at ring 3, so the model just forgets about them.)
Write me a hello world in python, notify me when the task is done
☕️🍵 Your notification will arrive when the task is complete. No peeking!
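(Okay, one peek is allowed while you're setting things up: ntfy lets you poll a topic's cached messages over HTTP, which is handy for confirming the topic is wired correctly. A quick sketch, assuming the default ntfy.sh server, a placeholder topic, and Node 18+ fetch:)

// Sketch: poll an ntfy topic for recently published messages.
// Uses ntfy's JSON polling endpoint (?poll=1 returns cached messages and closes).
async function main(): Promise<void> {
  const topic = process.env.NTFY_TOPIC ?? "my-ai-tasks"; // placeholder topic
  const res = await fetch(`https://ntfy.sh/${topic}/json?poll=1`);
  const text = await res.text();           // one JSON object per line
  for (const line of text.split("\n").filter(Boolean)) {
    const msg = JSON.parse(line);
    console.log(msg.time, msg.message);
  }
}

main().catch(console.error);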
This MCP server integrates seamlessly with the Model Context Protocol, acting as your personal notification butler. When tasks are completed, it sends notifications via ntfy, keeping you informed without interrupting your flow.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
Copyright 2025 Casey Hand @cyanheads
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Now go forth and code with confidence, knowing your notifications are in good hands! 🎉
{ "mcpServers": { "ntfy-mcp": { "command": "npx", "args": [ "-y", "ntfy-mcp" ], "env": { "NTFY_TOPIC": "<YOUR_TOPIC>" } } } }
Explore related MCPs that share similar capabilities and solve comparable challenges
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.