by deepfates
Runs Replicate models through the Model Context Protocol, exposing tools for model discovery, prediction management, and image handling via a simple CLI interface.
Provides a Model Context Protocol (MCP) server implementation that lets users interact with Replicate's AI models. The server offers tool‑based commands for searching models, creating and monitoring predictions, and handling generated images, all usable from MCP‑compatible clients such as Claude Desktop, Cursor, Cline, or Continue.
Installation
npm install -g mcp-replicate
Or run directly without installing:
npx -y mcp-replicate
Configuration
{
  "mcpServers": {
    "replicate": {
      "command": "mcp-replicate",
      "env": { "REPLICATE_API_TOKEN": "your_token_here" }
    }
  }
}
Alternatively, export REPLICATE_API_TOKEN=your_token_here before launching the server.
Then start the server (npx -y mcp-replicate) and make sure your MCP client points to it.
Tools

Models: search_models, list_models, get_model, list_collections, get_collection
Predictions: create_prediction, create_and_poll_prediction, get_prediction, cancel_prediction, list_predictions
Images: view_image, clear_image_cache, get_image_cache_stats
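MCP clients invoke these tools over JSON-RPC 2.0 via the protocol's tools/call method. As a sketch of what a search_models invocation looks like on the wire (the "query" argument name is an assumption for illustration, not taken from this README):

```typescript
// Build a JSON-RPC 2.0 request for the MCP "tools/call" method.
// The tool name comes from the list above; the "query" argument
// name is assumed for illustration.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function makeToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const req = makeToolCall(1, "search_models", { query: "text to image" });
console.log(JSON.stringify(req));
```

Your MCP client builds and sends these requests for you; the sketch only shows the shape of the traffic the server expects.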
FAQ

Q: The server is running but tools don't appear in my client.
A: Verify the MCP server settings in your client, make sure REPLICATE_API_TOKEN is set correctly, and restart both the server and the client.
Q: Where should I set the Replicate API token?
A: Prefer adding it to the client's configuration file (as shown above). Alternatively, export it as an environment variable before launching the server.

Q: What Node.js version is required?
A: Node.js >= 18.0.0.

Q: Can I run the server without installing it globally?
A: Yes, run npx -y mcp-replicate for a one-off execution.

Q: How do I view generated images?
A: Use the view_image tool, which opens the image URL in your default web browser.
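The README doesn't show how view_image launches the browser. A common cross-platform approach is to pick the operating system's opener command; this helper is hypothetical, not mcp-replicate's actual code:

```typescript
// Hypothetical helper: choose the OS command that opens a URL in the
// default browser. Illustrative only; not taken from mcp-replicate.
function openerCommand(platform: string): string {
  switch (platform) {
    case "darwin":
      return "open"; // macOS
    case "win32":
      return "start"; // Windows (run via cmd)
    default:
      return "xdg-open"; // most Linux desktops
  }
}

console.log(openerCommand("darwin"));
```

In Node.js the chosen command would typically be spawned with the image URL as its argument.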
A Model Context Protocol server implementation for Replicate. Run Replicate models through a simple tool-based interface.
npm install -g mcp-replicate
Get your Replicate API token:
Configure Claude Desktop:
Replace your_token_here with your actual Replicate API token:
{
  "mcpServers": {
    "replicate": {
      "command": "mcp-replicate",
      "env": {
        "REPLICATE_API_TOKEN": "your_token_here"
      }
    }
  }
}
(You can also use any other MCP client, such as Cursor, Cline, or Continue.)
To build and run from source:
git clone https://github.com/deepfates/mcp-replicate
cd mcp-replicate
npm install
npm run build
npm start
Or run the published package directly:
npx mcp-replicate
The server needs a Replicate API token to work. You can get one at Replicate.
There are two ways to provide the token:
Add it to your Claude Desktop configuration as shown in the Quickstart section:
{
"mcpServers": {
"replicate": {
"command": "mcp-replicate",
"env": {
"REPLICATE_API_TOKEN": "your_token_here"
}
}
}
}
Alternatively, you can set it as an environment variable if you're using another MCP client:
export REPLICATE_API_TOKEN=your_token_here
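Either way, the token must be visible to the server process at startup. A minimal sketch of the fail-fast check a server might perform (hypothetical helper, not mcp-replicate's actual implementation):

```typescript
// Hypothetical startup check: fail fast if the token is missing.
// Pass in an environment record (e.g. process.env in Node.js).
function requireToken(env: Record<string, string | undefined>): string {
  const token = env["REPLICATE_API_TOKEN"];
  if (!token) {
    throw new Error(
      "REPLICATE_API_TOKEN is not set. Add it to your MCP client " +
        "config or export it before launching the server."
    );
  }
  return token;
}

console.log(requireToken({ REPLICATE_API_TOKEN: "your_token_here" }));
```

Failing at startup with a clear message is preferable to surfacing authentication errors only when the first prediction is created.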
search_models: Find models using semantic search
list_models: Browse available models
get_model: Get details about a specific model
list_collections: Browse model collections
get_collection: Get details about a specific collection
create_prediction: Run a model with your inputs
create_and_poll_prediction: Run a model with your inputs and wait until it's completed
get_prediction: Check a prediction's status
cancel_prediction: Stop a running prediction
list_predictions: See your recent predictions
view_image: Open an image in your browser
clear_image_cache: Clean up cached images
get_image_cache_stats: Check cache usage

Development

npm install
npm run dev
npm run lint
npm run format
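The create_and_poll_prediction tool listed above combines create_prediction with a status poll. The loop can be sketched like this; the terminal statuses ("succeeded", "failed", "canceled") follow Replicate's documented prediction lifecycle, while the helper itself and its parameters are illustrative, not the package's actual implementation:

```typescript
// Illustrative poll loop: repeatedly fetch a prediction until it
// reaches one of Replicate's terminal statuses, sleeping between
// attempts. The callback wraps the real API/tool call.
type Prediction = { id: string; status: string; output?: unknown };

const TERMINAL = new Set(["succeeded", "failed", "canceled"]);

async function pollPrediction(
  getPrediction: () => Promise<Prediction>,
  intervalMs = 1000,
  maxAttempts = 60
): Promise<Prediction> {
  for (let i = 0; i < maxAttempts; i++) {
    const p = await getPrediction();
    if (TERMINAL.has(p.status)) return p;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("prediction did not finish in time");
}
```

intervalMs and maxAttempts are tuning knobs for the sketch, not parameters documented by mcp-replicate.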
License: MIT
If you'd rather have your MCP client launch the server through npx, use:
{
  "mcpServers": {
    "replicate": {
      "command": "npx",
      "args": ["-y", "mcp-replicate"],
      "env": { "REPLICATE_API_TOKEN": "<YOUR_API_KEY>" }
    }
  }
}