by kanad13
Provides MD5 and SHA‑256 hashing capabilities via the Model Context Protocol, allowing LLM‑driven workflows to compute cryptographic hashes without leaving the chat environment.
A lightweight MCP server exposing two tools, calculate_md5 and calculate_sha256, that accept arbitrary text and return the corresponding MD5 or SHA-256 hash. The server can be run either as a Docker container or directly from a Python environment and is designed to integrate with MCP-compatible clients such as VS Code Copilot Chat, Claude Desktop, and other LLM interfaces.
Docker (recommended)
docker pull kunalpathak13/hashing-mcp-server:latest
Configure your MCP client to launch the container with docker run -i --rm kunalpathak13/hashing-mcp-server:latest.
Python Direct Execution
uv venv && source .venv/bin/activate
uv pip install hashing-mcp-server
hashing-mcp-server # runs the server
Point the client to the absolute path of the installed script.
Interaction
Ask the client to hash some text and it will invoke the appropriate tool (calculate_md5 or calculate_sha256).
Q: Which Python version is required? A: Python 3.13 or newer.
Q: Do I need Docker if I use the Python package? A: No. Docker is only the recommended deployment method for simplicity.
Q: Can I extend the server with additional tools? A: Yes. Fork the repository, add new tool functions, and reinstall in editable mode.
Q: How are credentials handled? A: The server itself does not require API keys; any environment variables needed for custom extensions can be added by the user.
A Model Context Protocol (MCP) server for MD5 and SHA-256 hashing. This server enables LLMs to process cryptographic requests efficiently.
The server offers 2 tools:
- calculate_md5: Computes the MD5 hash of a given text.
- calculate_sha256: Computes the SHA-256 hash of a given text.
The server is designed to be used with MCP clients like VS Code Copilot Chat, Claude for Desktop, and other LLM interfaces that support the Model Context Protocol.
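Under the hood, each tool is a thin wrapper around Python's standard hashlib module. The snippet below is a minimal sketch of how such tools can be exposed, assuming the official MCP Python SDK's FastMCP helper; the server name and the text parameter are illustrative and may differ from this package's actual source.

# Minimal sketch (not the package's actual source) of MD5/SHA-256 tools
# registered with the MCP Python SDK's FastMCP helper.
import hashlib
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("hashing")  # server name is illustrative

@mcp.tool()
def calculate_md5(text: str) -> str:
    """Return the hexadecimal MD5 digest of the given text."""
    return hashlib.md5(text.encode("utf-8")).hexdigest()

@mcp.tool()
def calculate_sha256(text: str) -> str:
    """Return the hexadecimal SHA-256 digest of the given text."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default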
If you are new to the concept of the Model Context Protocol (MCP), the official MCP documentation is a good place to start.
The gif below shows how the MCP server processes requests and returns the corresponding cryptographic hashes.
I have used Claude Desktop as an example, but it works equally well with other MCP clients like VSCode.
Prerequisites: Python 3.13 or newer, a virtual environment tool (venv or uv), uv (recommended) or pip for installing the package, and Docker (optional, for testing the build).
Docker (Recommended)
This is the simplest way to run the server without managing Python environments directly.
1. Get the Docker Image:
Pull from Docker Hub (Easiest):
docker pull kunalpathak13/hashing-mcp-server:latest
2. Configure Your MCP Client:
Configure your client to use docker run.
VS Code (settings.json):
// In your VS Code settings.json (User or Workspace)
"mcp": {
  "servers": {
    "hashing-docker": { // Use a distinct name if needed
      "command": "docker",
      "args": [
        "run",
        "-i", // Keep STDIN open for communication
        "--rm", // Clean up container after exit
        "kunalpathak13/hashing-mcp-server:latest" // Change the tag to your version if needed, e.g. "hashing-mcp-server:X.Y.Z"
      ]
    }
  }
}
Claude Desktop (claude_desktop_config.json):
{
  "mcpServers": {
    "hashing-docker": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "kunalpathak13/hashing-mcp-server:latest"
      ]
    }
  }
}
(Claude Desktop expects strict JSON, so comments are not allowed here; change the image tag to a specific version if needed, e.g. hashing-mcp-server:X.Y.Z)
Other Clients: Adapt accordingly, using docker as the command and run -i --rm IMAGE_NAME as the arguments. Refer to the client's official documentation for precise configuration steps.
3. Test the Integration:
Once configured, interact with your MCP client (VS Code Chat, Claude Desktop, etc.). Ask questions designed to trigger the hashing tools:
- "Use the calculate_md5 tool on 'hello world'."
- "Compute the SHA256 hash for the text 'MCP rocks'."
The client should start the Docker container in the background using the command you provided, send the request, receive the hash result, and display it.
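If you want to exercise the container without an LLM client, the official MCP Python SDK can drive it over stdio. The script below is a sketch that assumes the mcp package is installed (pip install mcp) and that the tool takes an argument named text; both are assumptions, so adjust as needed.

# Sketch: call the Dockerized server directly over stdio via the MCP Python SDK.
# Assumes `pip install mcp`; the tool argument name "text" is an assumption.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(
        command="docker",
        args=["run", "-i", "--rm", "kunalpathak13/hashing-mcp-server:latest"],
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            result = await session.call_tool("calculate_md5", {"text": "hello world"})
            print("MD5 result:", result.content)

asyncio.run(main())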
Python Direct Execution
Use this method if you prefer not to use Docker or for development purposes.
1. Set Up Environment & Install:
# Create a dedicated directory and navigate into it
mkdir my_mcp_setup && cd my_mcp_setup
# --- Create & Activate Virtual Environment (Choose ONE method) ---
# Method A: Using uv (Recommended):
uv venv
source .venv/bin/activate # Linux/macOS
# .venv\Scripts\activate # Windows
# Method B: Using standard venv:
# python -m venv .venv
# source .venv/bin/activate # Linux/macOS
# .venv\Scripts\activate # Windows
# ---
# --- Install the package (within the active venv, choose ONE method) ---
# Method A: Using uv:
uv pip install hashing-mcp-server
# Method B: Using pip:
# pip install hashing-mcp-server
# ---
2. Find the Executable Path:
With the virtual environment active, find the full, absolute path to the installed script:
# On Linux/macOS:
which hashing-mcp-server
# Example Output: /home/user/my_mcp_setup/.venv/bin/hashing-mcp-server
# On Windows (Command Prompt/PowerShell):
where hashing-mcp-server
# Example Output: C:\Users\User\my_mcp_setup\.venv\Scripts\hashing-mcp-server.exe
Copy the full path displayed in the output.
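A cross-platform alternative is to let Python resolve the path from the active virtual environment using shutil.which:

# Run inside the activated virtual environment to print the script's absolute path.
import shutil
print(shutil.which("hashing-mcp-server"))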
3. Configure Your MCP Client:
Use the absolute path you copied in the client configuration.
VS Code (settings.json):
// In your VS Code settings.json (User or Workspace)
"mcp": {
  "servers": {
    // You can name this key anything, e.g., "hasher" or "cryptoTools"
    "hashing": {
      // Paste the full, absolute path you copied here:
      "command": "/full/path/to/your/virtualenv/bin/hashing-mcp-server"
      // No 'args' needed when running the installed script directly
    }
  }
}
(Replace the example path with your actual path)
Claude Desktop (claude_desktop_config.json):
{
  "mcpServers": {
    "hashing": {
      "command": "/full/path/to/your/virtualenv/bin/hashing-mcp-server"
    }
  }
}
(Replace the example path with your actual path)
Other Clients: Follow their specific instructions, providing the full absolute path found in step 2 as the command.
4. Test the Integration:
Once configured, interact with your MCP client (VS Code Chat, Claude Desktop, etc.). Ask questions designed to trigger the hashing tools:
- "Use the calculate_md5 tool on 'hello world'."
- "Compute the SHA256 hash for the text 'MCP rocks'."
The client should start the server script using the absolute path you provided, send the request, receive the hash result, and display it.
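To double-check what the client displays, you can compute the expected digests locally with Python's hashlib, which implements the same MD5 and SHA-256 algorithms the server uses:

# Print the expected digests for the example prompts and compare with the client's output.
import hashlib

for text in ("hello world", "MCP rocks"):
    print(text)
    print("  MD5:    ", hashlib.md5(text.encode("utf-8")).hexdigest())
    print("  SHA-256:", hashlib.sha256(text.encode("utf-8")).hexdigest())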
Development
Follow these steps if you want to modify the server code or contribute.
1. Clone the Repository:
git clone https://github.com/kanad13/MCP-Server-for-Hashing.git
cd MCP-Server-for-Hashing
2. Set Up Development Environment:
# Create & Activate Virtual Environment (using uv recommended)
uv venv
source .venv/bin/activate # Linux/macOS
# .venv\Scripts\activate # Windows
# Install in editable mode with development dependencies
uv pip install -e ".[dev]"
(This installs the package such that code changes in src/ take effect immediately without reinstalling. It also installs the development tools defined in [project.optional-dependencies.dev], such as pytest.)
3. Running Locally During Development: Ensure your development virtual environment is active. You can run the server using:
# Run the installed script (available due to -e flag)
hashing-mcp-server
Or execute the module directly:
python -m hashing_mcp.cli
(You might temporarily configure your MCP client to point to the executable path within this specific development .venv for integrated testing.)
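To extend the server with additional tools (as mentioned in the FAQ above), add another decorated function next to the existing ones; the editable install picks up the change immediately. The sketch below is hypothetical and assumes the FastMCP-style registration shown earlier; adapt the module structure to the real codebase.

# Hypothetical example of an additional tool (SHA-512 hashing).
# Assumes a FastMCP-style server object; in the real project, reuse its existing instance.
import hashlib
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("hashing")

@mcp.tool()
def calculate_sha512(text: str) -> str:
    """Return the hexadecimal SHA-512 digest of the given text."""
    return hashlib.sha512(text.encode("utf-8")).hexdigest()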
4. Running Tests: Ensure your development virtual environment is active:
pytest
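As an illustration of the kind of test that could live here, the snippet below checks the tool outputs against hashlib directly. The import path hashing_mcp and the synchronous function signatures are assumptions; adjust them to the package's real layout.

# Hypothetical test sketch; the import path and function signatures are assumptions.
import hashlib
from hashing_mcp import calculate_md5, calculate_sha256  # adjust to the real module layout

def test_calculate_md5_matches_hashlib():
    assert calculate_md5("hello world") == hashlib.md5(b"hello world").hexdigest()

def test_calculate_sha256_matches_hashlib():
    assert calculate_sha256("MCP rocks") == hashlib.sha256(b"MCP rocks").hexdigest()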
Releasing (for project maintainers)
The release process (building, testing, tagging, pushing to PyPI and Docker Hub) is automated by the build_and_push.sh script located in the repository root.
Prerequisites for Running the Script:
- An activated virtual environment (source .venv/bin/activate or .venv\Scripts\activate).
- Required tools installed: uv (or pip), twine, git, docker.
- Logged in to Docker Hub via docker login.
- PyPI credentials available, either as the TWINE_USERNAME=__token__ and TWINE_PASSWORD=pypi-... environment variables or via ~/.pypirc.
- Push access to the Git remote (origin by default) and the Docker Hub repository (kunalpathak13/hashing-mcp-server by default).
Release Steps:
1. Ensure the version field in pyproject.toml is updated to the correct new version number.
2. Make the script executable if needed: chmod +x build_and_push.sh
3. Activate the virtual environment: source .venv/bin/activate (or equivalent).
4. Run the script: ./build_and_push.sh
The script will build and test the package, publish it to PyPI, build and tag the Docker images (including latest), push the Docker images, create a Git tag, and push the Git tag.
This project is licensed under the MIT License - see the LICENSE file for details.