by YanxingLiu
Enables invocation of Dify workflows through the Model Context Protocol, exposing Dify's capabilities as tools that any MCP-compatible client can call.
The server acts as a bridge between Dify’s workflow engine and any client that supports the Model Context Protocol (MCP). By exposing Dify workflows as MCP tools, developers can trigger complex AI pipelines without writing custom integration code.
The server is configured through environment variables (DIFY_BASE_URL, DIFY_APP_SKS) or a config.yaml file. Install uv/uvx (a fast Python package manager) with curl -Ls https://astral.sh/uv/install.sh | sh. Start the server straight from the repository with uvx --from git+https://github.com/YanxingLiu/dify-mcp-server dify_mcp_server, or from a local checkout with uv --directory /path/to/dify-mcp-server run dify_mcp_server together with a config.yaml for local testing. Because uvx runs the server directly from the Git repository, it also simplifies CI/CD integration.
Q: Do I need to clone the repository to run the server?
A: No. Using uvx you can run it directly from Git. Cloning is only required if you prefer a local development setup.
Q: How many Dify workflows can I expose?
A: Any number; just list their App SKs in DIFY_APP_SKS (comma‑separated) or add them to the dify_app_sks array in config.yaml.
Q: What if I want to change the base URL after the server starts?
A: Update the environment variable or the YAML file and restart the server; the new value will be read on startup.
Q: Is there a Docker image available?
A: The README does not mention an official Docker image, but you can containerise the server by installing uv inside a base Python image and running the same command.
Q: Which Python version is required?
A: The project follows standard Python 3.x conventions; uv will resolve compatible versions automatically.
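For reference, a container can be built along the lines below. This Dockerfile is only an illustrative sketch, not an official image: it assumes the uv installer places uv/uvx under /root/.local/bin, that git is required for installing the server from the Git repository, and that real App SKs are supplied at runtime.
FROM python:3.12-slim
# curl is needed for the uv installer, git for fetching the server from the Git repository
RUN apt-get update && apt-get install -y --no-install-recommends curl ca-certificates git \
    && rm -rf /var/lib/apt/lists/* \
    && curl -Ls https://astral.sh/uv/install.sh | sh
# Assumption: the installer drops uv/uvx into /root/.local/bin
ENV PATH="/root/.local/bin:${PATH}"
ENV DIFY_BASE_URL="https://cloud.dify.ai/v1"
# Provide real keys at runtime, e.g. docker run -i -e DIFY_APP_SKS="app-sk1,app-sk2" ...
CMD ["uvx", "--from", "git+https://github.com/YanxingLiu/dify-mcp-server", "dify_mcp_server"]
Because the client configurations below launch the server over stdio, run the container with docker run -i so the MCP client can attach to its standard input and output.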
A simple MCP server implementation for Dify: it invokes Dify workflows by exposing them as MCP tools.
The server supports configuring base_url and app_sks through environment variables, making it more convenient to use with cloud-hosted platforms. It can be installed via Smithery or manually.
You can configure the server using either environment variables or a config.yaml file.
Set the following environment variables:
export DIFY_BASE_URL="https://cloud.dify.ai/v1"
export DIFY_APP_SKS="app-sk1,app-sk2" # Comma-separated list of your Dify App SKs
DIFY_BASE_URL: The base URL for your Dify API.
DIFY_APP_SKS: A comma-separated list of your Dify App Secret Keys (SKs). Each SK typically corresponds to a different Dify workflow you want to make available via MCP.
Alternatively, create a config.yaml file to store your Dify base URL and App SKs.
Example config.yaml:
dify_base_url: "https://cloud.dify.ai/v1"
dify_app_sks:
- "app-sk1" # SK for workflow 1
- "app-sk2" # SK for workflow 2
# Add more SKs as needed
dify_base_url: The base URL for your Dify API.
dify_app_sks: A list of your Dify App Secret Keys (SKs). Each SK typically corresponds to a different Dify workflow.
You can create this file quickly using the following command (adjust the path and values as needed):
# Create a directory if it doesn't exist
mkdir -p ~/.config/dify-mcp-server
# Create the config file
cat > ~/.config/dify-mcp-server/config.yaml <<EOF
dify_base_url: "https://cloud.dify.ai/v1"
dify_app_sks:
- "app-your-sk-1"
- "app-your-sk-2"
EOF
echo "Configuration file created at ~/.config/dify-mcp-server/config.yaml"
When running the server (as shown in Step 2), you will need to provide the path to this config.yaml file via the CONFIG_PATH environment variable if you choose this method.
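For example, once uv/uvx is installed (see the next step), a quick local run using this config file might look like the following; the CONFIG_PATH value is simply the file created above:
export CONFIG_PATH="$HOME/.config/dify-mcp-server/config.yaml"
uvx --from git+https://github.com/YanxingLiu/dify-mcp-server dify_mcp_server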
❓ If you haven't installed uv or uvx yet, you can do it quickly with the following command:
curl -Ls https://astral.sh/uv/install.sh | sh
Then add the server to your MCP client configuration. To run it with uvx directly from the Git repository, using environment variables:
{
"mcpServers": {
"dify-mcp-server": {
"command": "uvx",
"args": [
"--from","git+https://github.com/YanxingLiu/dify-mcp-server","dify_mcp_server"
],
"env": {
"DIFY_BASE_URL": "https://cloud.dify.ai/v1",
"DIFY_APP_SKS": "app-sk1,app-sk2",
}
}
}
}
or, using a config.yaml file via CONFIG_PATH:
{
"mcpServers": {
"dify-mcp-server": {
"command": "uvx",
"args": [
"--from","git+https://github.com/YanxingLiu/dify-mcp-server","dify_mcp_server"
],
"env": {
"CONFIG_PATH": "/Users/lyx/Downloads/config.yaml"
}
}
}
}
You can also run the Dify MCP server manually in your clients. The client configuration should look like the following:
{
"mcpServers": {
"mcp-server-rag-web-browser": {
"command": "uv",
"args": [
"--directory", "${DIFY_MCP_SERVER_PATH}",
"run", "dify_mcp_server"
],
"env": {
"CONFIG_PATH": "$CONFIG_PATH"
}
}
}
}
Example config:
{
"mcpServers": {
"dify-mcp-server": {
"command": "uv",
"args": [
"--directory", "/Users/lyx/Downloads/dify-mcp-server",
"run", "dify_mcp_server"
],
"env": {
"DIFY_BASE_URL": "https://cloud.dify.ai/v1",
"DIFY_APP_SKS": "app-sk1,app-sk2",
}
}
}
}
Finally, you can use the Dify tools in any client that supports MCP.
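If you want to sanity-check the exposed tools before wiring up a full client, one option (not mentioned in the README, so treat it as an assumption) is the MCP Inspector, which launches the server and lists its tools in a browser UI; it requires Node.js:
export DIFY_BASE_URL="https://cloud.dify.ai/v1"
export DIFY_APP_SKS="app-sk1,app-sk2"
npx @modelcontextprotocol/inspector uvx --from git+https://github.com/YanxingLiu/dify-mcp-server dify_mcp_server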
Explore related MCPs that share similar capabilities and solve comparable challenges
by activepieces
A self‑hosted, open‑source platform that provides a no‑code builder for creating, versioning, and running AI‑driven automation workflows. Pieces are TypeScript‑based plugins that become MCP servers, allowing direct consumption by large language models.
by Skyvern-AI
Automates browser‑based workflows by leveraging large language models and computer‑vision techniques, turning natural‑language prompts into fully functional web interactions without writing custom scripts.
by ahujasid
Enables Claude AI to control Blender for prompt‑assisted 3D modeling, scene creation, and manipulation via a socket‑based Model Context Protocol server.
by PipedreamHQ
Connect APIs quickly with a free, hosted integration platform that enables event‑driven automations across 1,000+ services and supports custom code in Node.js, Python, Go, or Bash.
by elie222
Organizes email inbox, drafts replies in the user's tone, tracks follow‑ups, and provides analytics to achieve inbox zero quickly.
by grab
Enables Cursor AI to read and programmatically modify Figma designs through a Model Context Protocol integration.
by CursorTouch
Enables AI agents to control the Windows operating system, performing file navigation, application launching, UI interaction, QA testing, and other automation tasks through a lightweight server.
by ahujasid
Enables Claude AI to control Ableton Live in real time, allowing AI‑driven creation, editing, and playback of tracks, clips, instruments, and effects through a socket‑based server.
by leonardsellem
Provides tools and resources to enable AI assistants to manage and execute n8n workflows via natural language commands.