by 66julienmartin
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.
Why Node.js? This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.
# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
cd deepseek-r1-mcp
npm install
# Set up environment
cp .env.example .env # Then add your API key
# Build and run
npm run build
By default, this server uses the DeepSeek-R1 model. If you want to use DeepSeek-V3 instead, modify the model name in src/index.ts:
// For DeepSeek-R1 (default)
model: "deepseek-reasoner"
// For DeepSeek-V3
model: "deepseek-chat"
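The server forwards prompts to DeepSeek's OpenAI-compatible chat completions API. As a rough sketch of what that request looks like (the helper name and exact shape are assumptions for illustration; the actual src/index.ts may be structured differently):

```typescript
// Hypothetical sketch of building a request body for DeepSeek's
// OpenAI-compatible chat completions endpoint. The helper name and
// structure are assumptions; the real src/index.ts may differ.
interface ChatRequest {
  model: string;
  messages: { role: "user" | "system" | "assistant"; content: string }[];
  max_tokens: number;
  temperature: number;
}

function buildChatRequest(
  prompt: string,
  model: "deepseek-reasoner" | "deepseek-chat" = "deepseek-reasoner",
  maxTokens = 8192,
  temperature = 0.2
): ChatRequest {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
    max_tokens: maxTokens,
    temperature,
  };
}

// The server would then POST this body, e.g.:
// fetch("https://api.deepseek.com/chat/completions", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.DEEPSEEK_API_KEY}`,
//   },
//   body: JSON.stringify(buildChatRequest("Hello")),
// });
```

Switching between R1 and V3 then amounts to changing the `model` string, as shown above.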
deepseek-r1-mcp/
├── src/
│ ├── index.ts # Main server implementation
├── build/ # Compiled files
│ ├── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json
Create a .env file with your DeepSeek API key:

DEEPSEEK_API_KEY=your-api-key-here
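At startup the server needs that key available in its environment. A minimal sketch of the kind of check involved, assuming the key is read from process.env (the function name is hypothetical, not the server's actual code):

```typescript
// Hypothetical startup check -- assumes the server reads DEEPSEEK_API_KEY
// from process.env (e.g. loaded from .env). The name is illustrative only.
function requireApiKey(
  env: Record<string, string | undefined> = process.env
): string {
  const key = env.DEEPSEEK_API_KEY;
  if (!key) {
    throw new Error(
      "DEEPSEEK_API_KEY is not set. Copy .env.example to .env and add your key."
    );
  }
  return key;
}
```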
Then add the server to your Claude Desktop configuration (claude_desktop_config.json):

{
"mcpServers": {
"deepseek_r1": {
"command": "node",
"args": ["/path/to/deepseek-r1-mcp/build/index.js"],
"env": {
"DEEPSEEK_API_KEY": "your-api-key"
}
}
}
}
npm run dev # Watch mode
npm run build # Build for production
Example tool invocation:

{
"name": "deepseek_r1",
"arguments": {
"prompt": "Your prompt here",
"max_tokens": 8192, // Maximum tokens to generate
"temperature": 0.2 // Controls randomness
}
}
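The defaults implied above (temperature 0.2, up to 8192 tokens) can be applied when validating incoming arguments. This helper is an illustrative assumption, not the server's actual implementation:

```typescript
// Illustrative sketch of validating the tool's arguments and applying the
// documented defaults (temperature 0.2, 8192-token cap). This helper is an
// assumption for illustration, not the server's real code.
interface ToolArguments {
  prompt: string;
  max_tokens?: number;
  temperature?: number;
}

function normalizeArguments(args: ToolArguments) {
  if (!args.prompt || typeof args.prompt !== "string") {
    throw new Error("prompt is required and must be a string");
  }
  return {
    prompt: args.prompt,
    max_tokens: Math.min(args.max_tokens ?? 8192, 8192), // model context limit
    temperature: args.temperature ?? 0.2, // documented default
  };
}
```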
The default value of temperature is 0.2. The temperature parameter controls the randomness of the output: lower values (e.g., 0.0) suit deterministic tasks like coding, while higher values (e.g., 1.5) suit creative tasks. Deepseek recommends setting the temperature according to your specific use case:
| Use Case | Temperature | Example |
|---|---|---|
| Coding / Math | 0.0 | Code generation, mathematical calculations |
| Data Cleaning / Data Analysis | 1.0 | Data processing tasks |
| General Conversation | 1.3 | Chat and dialogue |
| Translation | 1.3 | Language translation |
| Creative Writing / Poetry | 1.5 | Story writing, poetry generation |
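The table above can be encoded as a small lookup helper when choosing a temperature programmatically. The names and keys below are illustrative assumptions, not part of the server:

```typescript
// Recommended temperatures from the table above, encoded as a lookup.
// Keys and helper name are illustrative, not part of the server.
const RECOMMENDED_TEMPERATURE: Record<string, number> = {
  "coding/math": 0.0,
  "data-analysis": 1.0,
  "conversation": 1.3,
  "translation": 1.3,
  "creative-writing": 1.5,
};

function temperatureFor(useCase: string, fallback = 0.2): number {
  // Fall back to the server's default of 0.2 for unlisted use cases.
  return RECOMMENDED_TEMPERATURE[useCase] ?? fallback;
}
```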
The server provides detailed error messages for common issues.
Contributions are welcome! Please feel free to submit a Pull Request.
MIT