by ruixingshi
Provides Deepseek model's chain‑of‑thought reasoning to MCP‑enabled AI clients, supporting both OpenAI API mode and local Ollama mode.
Deepseek Thinker MCP Server delivers the reasoning process (CoT) generated by the Deepseek model to any MCP‑compatible AI client such as Claude Desktop. It acts as a bridge that queries either the official Deepseek API or a locally hosted Ollama instance and returns structured reasoning output.
Quick overview:

- Run the server with a single `npx` command; no global install is required.
- For OpenAI API mode, set `API_KEY` (your Deepseek/OpenAI key) and `BASE_URL` (the API endpoint).
- For local Ollama mode, set `USE_OLLAMA=true`; this requires a local Ollama instance with the Deepseek model installed.
- Register the server in the `mcpServers` section of the client configuration (e.g., `claude_desktop_config.json`).
- Start your MCP client; it launches the server via the `npx` command, and the server listens for MCP requests and returns reasoning results.
- To build and run from source: `npm install`, `npm run build`, then `node build/index.js`.
An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. Supports access to Deepseek's thought processes from the Deepseek API service or from a local Ollama server.
🤖 Dual Mode Support: works with both the OpenAI API mode and a local Ollama mode
🎯 Focused Reasoning: returns the model's chain-of-thought (CoT) output
originPrompt (string): the user's original prompt

For OpenAI API mode, set the following environment variables:
API_KEY=<Your OpenAI API Key>
BASE_URL=<API Base URL>
For Ollama mode, set the following environment variable:
USE_OLLAMA=true
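The two mode switches above can be sketched as a small selection helper. This is an illustrative guess at the logic, not the server's actual source; the `resolveMode` name and error message are made up:

```typescript
// Illustrative sketch (hypothetical, not this project's code): how a server
// like this might pick its backend from the environment variables above.
type Mode =
  | { kind: "ollama" }
  | { kind: "openai"; apiKey: string; baseUrl: string };

function resolveMode(env: Record<string, string | undefined>): Mode {
  // USE_OLLAMA=true selects the local Ollama backend.
  if (env.USE_OLLAMA === "true") {
    return { kind: "ollama" };
  }
  // Otherwise require the OpenAI-compatible API settings.
  const apiKey = env.API_KEY;
  const baseUrl = env.BASE_URL;
  if (!apiKey || !baseUrl) {
    throw new Error("Set API_KEY and BASE_URL, or USE_OLLAMA=true");
  }
  return { kind: "openai", apiKey, baseUrl };
}

// Example: Ollama mode takes precedence over any API settings.
console.log(resolveMode({ USE_OLLAMA: "true" }).kind); // prints "ollama"
```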
Add the following configuration to your claude_desktop_config.json
:
{
"mcpServers": {
"deepseek-thinker": {
"command": "npx",
"args": [
"-y",
"deepseek-thinker-mcp"
],
"env": {
"API_KEY": "<Your API Key>",
"BASE_URL": "<Your Base URL>"
}
}
}
}
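For scripted setups, an entry like the one above can be merged into an existing parsed `claude_desktop_config.json` object without clobbering other servers. The helper below is invented for illustration and is not part of this project:

```typescript
// Illustrative helper (hypothetical, not part of deepseek-thinker-mcp):
// merge a server entry into a parsed claude_desktop_config.json object.
interface ServerEntry {
  command: string;
  args: string[];
  env?: Record<string, string>;
}
interface ClaudeConfig {
  mcpServers?: Record<string, ServerEntry>;
}

function addMcpServer(
  config: ClaudeConfig,
  name: string,
  entry: ServerEntry,
): ClaudeConfig {
  // Spread existing servers so other entries are preserved.
  return {
    ...config,
    mcpServers: { ...(config.mcpServers ?? {}), [name]: entry },
  };
}

const updated = addMcpServer(
  { mcpServers: { other: { command: "node", args: [] } } },
  "deepseek-thinker",
  {
    command: "npx",
    args: ["-y", "deepseek-thinker-mcp"],
    env: { API_KEY: "<Your API Key>", BASE_URL: "<Your Base URL>" },
  },
);
console.log(Object.keys(updated.mcpServers ?? {}).join(",")); // prints "other,deepseek-thinker"
```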
For local Ollama mode, use:

{
"mcpServers": {
"deepseek-thinker": {
"command": "npx",
"args": [
"-y",
"deepseek-thinker-mcp"
],
"env": {
"USE_OLLAMA": "true"
}
}
}
}
To run a manually built local server, use:

{
"mcpServers": {
"deepseek-thinker": {
"command": "node",
"args": [
"/your-path/deepseek-thinker-mcp/build/index.js"
],
"env": {
"API_KEY": "<Your API Key>",
"BASE_URL": "<Your Base URL>"
}
}
}
}
# Install dependencies
npm install
# Build project
npm run build
# Run service
node build/index.js
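Once the server is running, an MCP client invokes its reasoning tool over JSON-RPC. The sketch below only illustrates the message shape; the tool name `get-deepseek-thinker` is an assumption for illustration, not confirmed from this project's source:

```typescript
// Illustrative JSON-RPC 2.0 request an MCP client might send to invoke
// the reasoning tool. The tool name is assumed, not confirmed.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "get-deepseek-thinker", // assumed tool name
    arguments: {
      originPrompt: "Why is the sky blue?", // the documented parameter
    },
  },
};

// MCP stdio transport carries messages as serialized JSON.
const wire = JSON.stringify(request);
console.log(JSON.parse(wire).method); // prints "tools/call"
```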
This error occurs when the Deepseek API responds too slowly or when the reasoning content is very long, causing the MCP server to time out.
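One generic client-side mitigation is to cap how long you wait for a reasoning call. The wrapper below is purely illustrative and not code from this project:

```typescript
// Generic timeout wrapper (hypothetical) a client could place around a
// slow reasoning call; rejects if the work does not settle within `ms`.
function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`timed out after ${ms} ms`)),
      ms,
    );
    work.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); },
    );
  });
}

// Example: a fast task completes well before the deadline.
withTimeout(Promise.resolve("ok"), 1000).then((v) => console.log(v)); // prints "ok"
```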
This project is licensed under the MIT License. See the LICENSE file for details.