by jonathan-politzki
Enables Claude to access, permanently cache, and semantically search your Substack and Medium essays, delivering contextual assistance for writing tasks.
Writer Context Tool is an MCP server that fetches your posts from Substack and Medium via RSS, stores them locally, generates embeddings, and exposes each essay as an individual resource for Claude to query.
To set it up:
- Install the dependencies from requirements.txt in a Python environment (uv or standard venv).
- Copy config.example.json to config.json and fill in your Substack/Medium URLs and optional settings (max posts, cache duration, similar posts count).
- Add a Claude Desktop configuration entry (claude_desktop_config.json) pointing to the Python command that runs writer_tool.py (or use the provided run_writer_tool.sh script).
- Use the refresh_content command to update the cache.
Once running, use the search_writing tool to find the most relevant essays, or the refresh_content tool to manually refresh the cache.
Q: Do my posts stay private?
A: The tool only accesses publicly available RSS feeds and caches the content locally on your machine.
Q: How often does the cache refresh?
A: By default the cache refreshes weekly (cache_duration_minutes = 10080), but you can trigger an immediate refresh with the refresh_content tool.
Q: What if my blog is on a platform other than Substack or Medium?
A: Currently only Substack and Medium are supported; you can extend the code to add additional RSS‑compatible platforms (see the sketch after this FAQ).
Q: Do I need an API key for embeddings?
A: The repository uses the default embedding model bundled with the project; if you switch to a custom model, provide the required API key via environment variables.
Q: Can I run the server without uv?
A: Yes, you can use a standard venv and pip install -r requirements.txt; the Claude Desktop config only needs the absolute path to the Python interpreter that runs writer_tool.py.
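The extension point mentioned in the FAQ above mainly amounts to mapping a new platform type to its RSS feed URL. Below is a minimal sketch of that idea, assuming feedparser is used for parsing; the generic "rss" type and the feed_url_for helper are hypothetical and not part of the current codebase.

# Hypothetical sketch of mapping platform config entries to RSS feed URLs,
# including a generic "rss" type that is NOT in the current codebase.
import feedparser

def feed_url_for(platform: dict) -> str:
    """Return the RSS feed URL for a platform entry from config.json."""
    if platform["type"] == "substack":
        return platform["url"].rstrip("/") + "/feed"  # Substack feeds live at /feed
    if platform["type"] == "medium":
        return platform["url"].replace("medium.com/", "medium.com/feed/")  # /feed/@username
    if platform["type"] == "rss":  # hypothetical generic type
        return platform["url"]     # treat the configured URL as the feed itself
    raise ValueError(f"Unsupported platform type: {platform['type']}")

# Example: parse whatever feed the entry resolves to
feed = feedparser.parse(feed_url_for({"type": "rss", "url": "https://example.com/feed.xml"}))
print(len(feed.entries), "posts found")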
An open-source Model Context Protocol (MCP) implementation that connects Claude to your Substack and Medium writing.
Writer Context Tool is an MCP server that allows Claude to access and analyze your writing from platforms like Substack and Medium. With this tool, Claude can understand the context of your published content, providing more personalized assistance with your writing.
The tool connects to your Substack/Medium blogs via their RSS feeds, fetches your posts, and permanently caches them locally. It also generates embeddings for each post, enabling semantic search to find the most relevant essays based on your queries.
When you ask Claude about your writing, it can use these individual essay resources to provide insights or help you develop new ideas based on your existing content.
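The actual logic lives in writer_tool.py; as a rough illustration of the fetch, cache, and embed flow described above, here is a minimal sketch. It assumes feedparser for RSS parsing and sentence-transformers for embeddings; the cache path, model name, and field names are placeholders rather than the project's real choices.

# Illustrative sketch only, not the project's actual code. Assumes feedparser
# and sentence-transformers; the real tool may use a different model and cache layout.
import json
import pathlib
import feedparser
from sentence_transformers import SentenceTransformer

CACHE_FILE = pathlib.Path("cache/posts.json")  # placeholder cache location

def fetch_and_cache(feed_url: str, max_posts: int = 100) -> list[dict]:
    """Fetch up to max_posts entries from an RSS feed and cache them locally."""
    feed = feedparser.parse(feed_url)
    posts = [
        {"title": e.get("title", ""), "link": e.get("link", ""), "content": e.get("summary", "")}
        for e in feed.entries[:max_posts]
    ]
    CACHE_FILE.parent.mkdir(parents=True, exist_ok=True)
    CACHE_FILE.write_text(json.dumps(posts, indent=2))
    return posts

def embed_posts(posts: list[dict]) -> list[list[float]]:
    """Generate one embedding per post so essays can be searched semantically."""
    model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model, not confirmed by the repo
    texts = [p["title"] + "\n" + p["content"] for p in posts]
    return model.encode(texts).tolist()

posts = fetch_and_cache("https://yourusername.substack.com/feed")
vectors = embed_posts(posts)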
git clone https://github.com/yourusername/writer-context-tool.git
cd writer-context-tool
Using uv (recommended):
# Install uv if you don't have it
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create virtual environment and install dependencies
uv venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
uv pip install -r requirements.txt
Or using standard pip:
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
pip install -r requirements.txt
Copy the example configuration file:
cp config.example.json config.json
Edit config.json with your Substack/Medium URLs:
{
  "platforms": [
    {
      "type": "substack",
      "url": "https://yourusername.substack.com",
      "name": "My Substack Blog"
    },
    {
      "type": "medium",
      "url": "https://medium.com/@yourusername",
      "name": "My Medium Blog"
    }
  ],
  "max_posts": 100,
  "cache_duration_minutes": 10080,
  "similar_posts_count": 10
}
- max_posts: Maximum number of posts to fetch from each platform (default: 100)
- cache_duration_minutes: How long to cache content before refreshing (default: 1 week, or 10080 minutes)
- similar_posts_count: Number of most relevant posts to return when searching (default: 10)
Create the Claude Desktop configuration directory:
# On macOS
mkdir -p ~/Library/Application\ Support/Claude/
Create the configuration file:
# Get the absolute path to your uv command
UV_PATH=$(which uv)
# Create the configuration
cat > ~/Library/Application\ Support/Claude/claude_desktop_config.json << EOF
{
  "mcpServers": {
    "writer-tool": {
      "command": "${UV_PATH}",
      "args": [
        "--directory",
        "$(pwd)",
        "run",
        "writer_tool.py"
      ]
    }
  }
}
EOF
Note: If you experience issues with the uv command, you can use the included shell script alternative:
- Make the script executable:
chmod +x run_writer_tool.sh
- Update your Claude Desktop config to use the script:
{ "mcpServers": { "writer-tool": { "command": "/absolute/path/to/run_writer_tool.sh", "args": [] } } }
Restart Claude Desktop
Once set up, you'll see individual essays available as resources in Claude Desktop. You can:
Search across your writing: Ask Claude to find relevant content (a sketch of how results can be ranked follows below)
Reference specific essays: Access individual essays by clicking on them when listed in search results
Refresh content: Force a refresh of your content
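For a sense of how the search step can work, here is a conceptual sketch of ranking cached essay embeddings against a query embedding with cosine similarity; the function and variable names are illustrative and not taken from writer_tool.py. The top_k default mirrors the similar_posts_count setting above.

# Conceptual sketch of embedding-based ranking; names are illustrative only.
import numpy as np

def rank_essays(query_vec: np.ndarray, essay_vecs: np.ndarray, top_k: int = 10) -> np.ndarray:
    """Return indices of the top_k essays most similar to the query embedding."""
    q = query_vec / np.linalg.norm(query_vec)                           # normalize query
    e = essay_vecs / np.linalg.norm(essay_vecs, axis=1, keepdims=True)  # normalize essays
    scores = e @ q                                                      # cosine similarities
    return np.argsort(scores)[::-1][:top_k]                             # best matches first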
The Writer Context Tool provides:
The tool implements permanent caching with these features:
If you encounter issues:
Tool doesn't appear in Claude Desktop:
No content appears:
Error with uv command:
Embedding issues:
This project is available under the MIT License.
Explore related MCPs that share similar capabilities and solve comparable challenges
by modelcontextprotocol
A basic implementation of persistent memory using a local knowledge graph. This lets Claude remember information about the user across chats.
by topoteretes
Provides dynamic memory for AI agents through modular ECL (Extract, Cognify, Load) pipelines, enabling seamless integration with graph and vector stores using minimal code.
by basicmachines-co
Enables persistent, local‑first knowledge management by allowing LLMs to read and write Markdown files during natural conversations, building a traversable knowledge graph that stays under the user’s control.
by smithery-ai
Provides read and search capabilities for Markdown notes in an Obsidian vault for Claude Desktop and other MCP clients.
by chatmcp
Summarize chat messages by querying a local chat database and returning concise overviews.
by dmayboroda
Provides on‑premises conversational retrieval‑augmented generation (RAG) with configurable Docker containers, supporting fully local execution, ChatGPT‑based custom GPTs, and Anthropic Claude integration.
by GreatScottyMac
Provides a project‑specific memory bank that stores decisions, progress, architecture, and custom data, exposing a structured knowledge graph via MCP for AI assistants and IDE tools.
by andrea9293
Provides document management and AI-powered semantic search for storing, retrieving, and querying text, markdown, and PDF files locally without external databases.
by scorzeth
Provides a local MCP server that interfaces with a running Anki instance to retrieve, create, and update flashcards through standard MCP calls.