by universal-mcp
Provides a standardized interface for interacting with Braze's tools and services through a unified API using the Universal MCP framework.
The Braze Universal MCP Server implements a Model Context Protocol (MCP) service that exposes Braze's marketing and engagement capabilities via a common, language‑agnostic API. It allows developers to call Braze functionality (such as user profiling, campaign triggers, and analytics) without dealing with Braze‑specific SDKs.
Setup requires Python 3.11+ and the uv package manager. Running uv sync creates a virtual environment and installs the project's packages; mcp dev src/universal_mcp_braze/server.py launches the server locally and reports its address and port; and mcp install src/universal_mcp_braze/server.py registers the service so it becomes discoverable by other MCP-compatible tools. The full list of available tools is documented in src/universal_mcp_braze/README.md, the standard development commands (uv sync, mcp dev, mcp install) support rapid iteration, and the Braze tool implementations live in src/universal_mcp_braze/app.py.
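Once the server is running, any MCP-compatible client can discover and invoke its tools. The following is a minimal sketch using the official MCP Python SDK over stdio, assuming the mcp CLI is available on PATH; the tool name "users_track" and its arguments are hypothetical, so check src/universal_mcp_braze/README.md for the real names.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the Braze MCP server as a stdio subprocess (assumes the mcp CLI
    # is installed and the project's dependencies are synced with uv).
    params = StdioServerParameters(
        command="mcp",
        args=["run", "src/universal_mcp_braze/server.py"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the Braze tools the server exposes.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # "users_track" and its arguments are hypothetical; pick a real
            # tool name from src/universal_mcp_braze/README.md.
            result = await session.call_tool(
                "users_track",
                arguments={"external_id": "user-123"},
            )
            print(result)

asyncio.run(main())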
Q: Do I need a Braze account to run the server?
A: Yes. The server forwards calls to Braze's REST endpoints, so a valid Braze API key and project configuration are required (set via environment variables in .env).
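For reference, a minimal .env sketch might look like the following; the variable names and endpoint are illustrative assumptions, so use whatever configuration keys the project actually reads:

# Illustrative only - check the project's configuration for the exact keys it expects
BRAZE_API_KEY=your-braze-rest-api-key
BRAZE_REST_ENDPOINT=https://rest.iad-01.braze.com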
Q: Can I run this server in production?
A: Absolutely. Deploy the same entry point (src/universal_mcp_braze/server.py) to any environment that supports Python 3.11+, and configure the necessary Braze credentials.
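For example, assuming the project's dependencies are synced with uv and the mcp CLI is available in the deployed environment, the server can be launched with the command below; transport and process-management details depend on your deployment.

uv run mcp run src/universal_mcp_braze/server.py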
Q: How do I add new Braze operations?
A: Implement additional tool classes in src/universal_mcp_braze/app.py following the existing pattern, and update the tool list in the README.
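As a rough illustration only (not the project's actual pattern), the sketch below shows the shape such an operation might take: a function that forwards to Braze's /users/track REST endpoint with httpx. The class it belongs to, how it gets registered as a tool, and the BRAZE_API_KEY / BRAZE_REST_ENDPOINT settings are assumptions; follow the existing classes in src/universal_mcp_braze/app.py for the real structure.

import os
from datetime import datetime, timezone

import httpx

def track_custom_event(external_id: str, event_name: str) -> dict:
    # Sketch only: in the real app this would live on the Braze application
    # class in app.py so the framework can expose it as an MCP tool.
    base_url = os.environ["BRAZE_REST_ENDPOINT"]  # assumed setting name
    api_key = os.environ["BRAZE_API_KEY"]         # assumed setting name
    payload = {
        "events": [
            {
                "external_id": external_id,
                "name": event_name,
                "time": datetime.now(timezone.utc).isoformat(),
            }
        ]
    }
    response = httpx.post(
        f"{base_url}/users/track",
        json=payload,
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()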
Q: Is there built-in authentication for the MCP server?
A: Authentication is handled by the underlying MCP framework; you can configure additional security layers (e.g., API tokens) as needed.
This repository contains an implementation of a Braze Universal MCP (Model Context Protocol) server. It provides a standardized interface for interacting with Braze's tools and services through a unified API.
The server is built using the Universal MCP framework.
This implementation follows the MCP specification, ensuring compatibility with other MCP-compliant services and tools.
You can start using Braze directly from agentr.dev. Visit agentr.dev/apps and enable Braze.
If you have not used Universal MCP before, follow the setup instructions at agentr.dev/quickstart
The full list of available tools is at ./src/universal_mcp_braze/README.md
Ensure you have the following before you begin:
Python 3.11 or later
The uv package manager (install with pip install uv)
Follow the steps below to set up your development environment:
Sync Project Dependencies
uv sync
This installs all dependencies from pyproject.toml into a local virtual environment (.venv).
Activate the Virtual Environment
For Linux/macOS:
source .venv/bin/activate
For Windows (PowerShell):
.venv\Scripts\Activate
Start the MCP Inspector
mcp dev src/universal_mcp_braze/server.py
This will start the MCP inspector. Make note of the address and port shown in the console output.
Install the Application
mcp install src/universal_mcp_braze/server.py
This registers the server so that it becomes discoverable by other MCP-compatible tools.
The project is organized as follows:
.
├── src/
│   └── universal_mcp_braze/
│       ├── __init__.py    # Package initializer
│       ├── server.py      # Server entry point
│       ├── app.py         # Application tools
│       └── README.md      # List of application tools
├── tests/                 # Test suite
├── .env                   # Environment variables for local development
├── pyproject.toml         # Project configuration
└── README.md              # This file
This project is licensed under the MIT License.
Generated with MCP CLI — Happy coding! 🚀