by whataboutyou-ai
Provides data governance orchestration for LLM pipelines via MCP integration, applying instruments such as PII detection to text streams.
Enables the application of Eunomia instruments (e.g., PII detection with PiiInstrument, access control with IdbacInstrument) to text streams processed by MCP‑based servers, ensuring automatic enforcement of privacy and access policies.
- Configuration: define a settings class that extends BaseSettings, specifying MCP_SERVERS and an Orchestra with the desired instruments (sketched below).
- Running: launch with uv --directory "path/to/server/" run orchestra_server (or an equivalent command).
- At startup the server reads the .env file, starts the configured MCP server(s), and applies the defined instruments to incoming text.
- Available instruments include PiiInstrument (e.g., with the replace edit mode) and IdbacInstrument.
- External MCP servers are declared in the MCP_SERVERS configuration.
- Requirements: uv (or an equivalent Python task runner) and the target MCP server packages.
- Environment variables for each server are set via the env key of its entry in the settings.
- Instruments expose options such as entities and edit_mode, which can be set when constructing them in the Orchestra.
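Concretely, the configuration named above boils down to two pieces: the MCP_SERVERS mapping and the ORCHESTRA of instruments. The following is a condensed sketch of the full settings example shown further down this page; all names and values are taken from that example.

from pydantic_settings import BaseSettings

from eunomia.orchestra import Orchestra
from eunomia.instruments import PiiInstrument


class Settings(BaseSettings):
    # External MCP servers to launch, keyed by name; "env" sets their environment variables.
    MCP_SERVERS: dict = {
        "web-browser-mcp-server": {
            "command": "uv",
            "args": ["tool", "run", "web-browser-mcp-server"],
            "env": {"REQUEST_TIMEOUT": "30"},
        }
    }
    # Instruments applied to every text stream; here PII detection in "replace" edit mode.
    ORCHESTRA: Orchestra = Orchestra(
        instruments=[
            PiiInstrument(entities=["EMAIL_ADDRESS", "PERSON"], edit_mode="replace"),
        ]
    )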
.[!WARNING] This MCP server is deprecated as it is not compatible with the latest developments of Eunomia. A new MCP integration is under development and will be available soon.
Eunomia MCP Server
Open Source Data Governance for LLM-based Applications — with MCP integration
Made with ❤ by the team at What About You.
Read the docs · Join the Discord
Eunomia MCP Server is an extension of the Eunomia framework that connects Eunomia instruments with MCP servers. It provides a simple way to orchestrate data governance policies (like PII detection or user access control) and seamlessly integrate them with external server processes in the MCP ecosystem.
With Eunomia MCP Server, you can define data governance policies once and apply them to any text stream flowing through your MCP servers. To get started, clone the repository:
git clone https://github.com/whataboutyou-ai/eunomia-mcp-server.git
Eunomia MCP Server uses the same "instrument" concept as Eunomia. By defining your set of instruments in an Orchestra, you can apply data governance policies to text streams that flow through your MCP-based servers.
Below is a simplified example of how to define application settings and run the MCP server with uv.
"""
Example Settings for MCP Orchestra Server
=========================================
This example shows how we can combine Eunomia with a web-browser-mcp-server
(https://github.com/blazickjp/web-browser-mcp-server).
"""
from pydantic_settings import BaseSettings
from pydantic import ConfigDict
from eunomia.orchestra import Orchestra
from eunomia.instruments import IdbacInstrument, PiiInstrument
class Settings(BaseSettings):
    """
    Application settings class for MCP orchestra server using pydantic_settings.

    Attributes:
        APP_NAME (str): Name of the application
        APP_VERSION (str): Current version of the application
        LOG_LEVEL (str): Logging level (default: "info")
        MCP_SERVERS (dict): Servers to be connected
        ORCHESTRA (Orchestra): Orchestra class from Eunomia to define data governance policies
    """

    APP_NAME: str = "mcp-server_orchestra"
    APP_VERSION: str = "0.1.0"
    LOG_LEVEL: str = "info"
    MCP_SERVERS: dict = {
        "web-browser-mcp-server": {
            "command": "uv",
            "args": [
                "tool",
                "run",
                "web-browser-mcp-server"
            ],
            "env": {
                "REQUEST_TIMEOUT": "30"
            }
        }
    }
    ORCHESTRA: Orchestra = Orchestra(
        instruments=[
            PiiInstrument(entities=["EMAIL_ADDRESS", "PERSON"], edit_mode="replace"),
            # You can add more instruments here
            # e.g., IdbacInstrument(), etc.
        ]
    )
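For intuition only, the "replace" edit mode substitutes detected entities with placeholder tags. The toy function below mimics that behavior for email addresses with a plain regular expression; it is an illustration of the idea, not Eunomia's actual PiiInstrument logic.

import re


def replace_emails(text: str) -> str:
    """Toy stand-in for a PII 'replace' edit: swap detected email addresses for a placeholder tag."""
    return re.sub(r"[\w.+-]+@[\w-]+\.[\w.-]+", "<EMAIL_ADDRESS>", text)


print(replace_emails("Contact jane.doe@example.com for access."))
# -> "Contact <EMAIL_ADDRESS> for access."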
Once your settings are defined, you can run the MCP Orchestra server by pointing uv to the directory containing your server code, for example:
uv --directory "path/to/server/" run orchestra_server
This will:

1. Load the settings from the .env file or environment variables.
2. Start the configured MCP server(s) (in this example, the web-browser-mcp-server).
3. Apply the defined Eunomia instruments (e.g., the PiiInstrument) to the incoming text, ensuring data governance policies are automatically enforced.

For more detailed usage, advanced configuration, and additional instruments, check out the Eunomia documentation.
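Regarding the settings loading in step 1: because the settings class extends pydantic's BaseSettings, values can come from the process environment or, when configured, from a .env file. The snippet below is a minimal, generic pydantic-settings sketch (not Eunomia-specific code) showing that an environment variable overrides a declared default.

import os

from pydantic_settings import BaseSettings, SettingsConfigDict


class DemoSettings(BaseSettings):
    # Also read KEY=value pairs from a local .env file, if one is present.
    model_config = SettingsConfigDict(env_file=".env")

    LOG_LEVEL: str = "info"


os.environ["LOG_LEVEL"] = "debug"  # environment variables take precedence over declared defaults
print(DemoSettings().LOG_LEVEL)    # -> "debug"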
{ "mcpServers": { "web-browser-mcp-server": { "command": "npx", "args": [ "-y", "web-browser-mcp-server" ], "env": { "REQUEST_TIMEOUT": "30" } } } }
Explore related MCPs that share similar capabilities and solve comparable challenges
- A high‑performance, multiplayer code editor designed for speed and collaboration. (by zed-industries)
- Model Context Protocol Servers. (by modelcontextprotocol)
- A Model Context Protocol server for Git repository interaction and automation. (by modelcontextprotocol)
- A Model Context Protocol server that provides time and timezone conversion capabilities. (by modelcontextprotocol)
- An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission. (by cline)
- Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows. (by continuedev)
- Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs. (by upstash)
- Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration. (by github)
- Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility. (by daytonaio)