by DMontgomery40
A Model Context Protocol server that proxies DeepSeek's language models, enabling seamless integration with MCP‑compatible applications.
Provides a proxy server that exposes DeepSeek's advanced language models (R1 and V3) through the Model Context Protocol, allowing tools like Claude Desktop to interact with them without exposing the API key directly.
npm install -g deepseek-mcp-server
# or via Smithery
npx -y @smithery/cli install @dmontgomery40/deepseek-mcp-server --client claude
Add this to your claude_desktop_config.json:
{
"mcpServers": {
"deepseek": {
"command": "npx",
"args": ["-y", "deepseek-mcp-server"],
"env": { "DEEPSEEK_API_KEY": "your-api-key" }
}
}
}
The server automatically falls back from R1 (deepseek-reasoner) to V3 (deepseek-chat) if the primary model is unavailable. Current settings and available models can be queried through the model-config and models resources.
Q: Which model is used by default?
A: The server starts with the R1 model (deepseek-reasoner). If it fails, it switches to V3 (deepseek-chat).
Q: How do I change the model during a conversation?
A: Simply include a natural language command like "use deepseek-chat" or "switch to deepseek-reasoner".
Q: Do I need to rebuild the project to change settings?
A: No. All configuration changes are handled at runtime through the MCP interface.
Q: Can I test the server locally?
A: Yes. Build with npm run build and run with the MCP Inspector:
npx @modelcontextprotocol/inspector node ./build/index.js
Note: The server intelligently handles these natural language requests by mapping them to appropriate configuration changes. You can also query the current settings and available models through the model-config and models resources.
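As a minimal sketch of how such a natural-language model switch could be detected (the function name and matching logic below are hypothetical illustrations, not the server's actual internals):

```typescript
// Models the server exposes, per the README.
const KNOWN_MODELS = ["deepseek-reasoner", "deepseek-chat"] as const;
type KnownModel = (typeof KNOWN_MODELS)[number];

// Return the model a message asks for, or null if it contains no switch request.
// Matches phrases like "use deepseek-chat" or "switch to deepseek-reasoner".
function parseModelSwitch(message: string): KnownModel | null {
  const lower = message.toLowerCase();
  for (const model of KNOWN_MODELS) {
    if (lower.includes(`use ${model}`) || lower.includes(`switch to ${model}`)) {
      return model;
    }
  }
  return null;
}
```

A request with no recognized switch phrase would simply leave the current configuration unchanged.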
If R1 (called deepseek-reasoner in the server) is unavailable, the server automatically retries the request with V3 (called deepseek-chat in the server). Note: You can also switch back and forth at any time by simply saying "use deepseek-reasoner" or "use deepseek-chat" in your prompt.
Multi-turn conversation support:
This feature is particularly valuable for two key use cases:
Training & Fine-tuning: Since DeepSeek is open source, many users are training their own versions. The multi-turn support provides properly formatted conversation data that's essential for training high-quality dialogue models.
Complex Interactions: For production use, this helps manage longer conversations where context is crucial.
The implementation handles all context management and message formatting behind the scenes, letting you focus on the actual interaction rather than the technical details of maintaining conversation state.
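As a rough sketch of that idea (the Message shape and Conversation class are illustrative assumptions, not the server's actual implementation; the role/content format mirrors the OpenAI-style messages the DeepSeek API accepts):

```typescript
interface Message {
  role: "user" | "assistant";
  content: string;
}

// Accumulates a conversation so the full history can be sent as context
// on each subsequent request.
class Conversation {
  private messages: Message[] = [];

  // Record one user turn and the model's reply, preserving order.
  addTurn(userContent: string, assistantContent: string): void {
    this.messages.push({ role: "user", content: userContent });
    this.messages.push({ role: "assistant", content: assistantContent });
  }

  // Snapshot of the history, ready to include in the next API call.
  history(): Message[] {
    return [...this.messages];
  }
}
```

Keeping the history in this ordered role/content form is also what makes the transcript directly usable as training data for fine-tuning.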
You can test the server locally using the MCP Inspector tool:
Build the server:
npm run build
Run the server with MCP Inspector:
# Make sure to specify the full path to the built server
npx @modelcontextprotocol/inspector node ./build/index.js
The inspector will open in your browser and connect to the server via stdio transport.
Note: The server uses DeepSeek's R1 model (deepseek-reasoner) by default, which provides state-of-the-art performance for reasoning and general tasks.
MIT