by pyroprompts
Provides a STDIO MCP server that forwards requests to a Streamable HTTP MCP server, enabling any MCP client with STDIO support to communicate with Streamable HTTP back‑ends instantly.
A lightweight bridge that runs as a STDIO‑based MCP server and relays all MCP traffic to a configured Streamable HTTP MCP endpoint. By deploying this adapter, existing MCP clients (e.g., Claude Desktop, LibreChat) gain immediate compatibility with the new Streamable HTTP transport without any code changes.
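At its core, the bridge turns each JSON-RPC message arriving on stdin into an HTTP POST against the configured endpoint. The sketch below is illustrative only (the published adapter builds on the MCP SDK transports; all names here are hypothetical) and shows that mapping as a pure function:

```typescript
// Illustrative sketch only: the real adapter uses the official MCP SDK
// transports. These names are hypothetical.
interface AdapterConfig {
  uri: string;          // URI env var: the Streamable HTTP endpoint (required)
  bearerToken?: string; // BEARER_TOKEN env var (optional)
}

// Map one JSON-RPC message from the stdio side to the HTTP request the
// adapter would send, without performing any network I/O.
function toHttpRequest(
  cfg: AdapterConfig,
  message: object
): { url: string; method: "POST"; headers: Record<string, string>; body: string } {
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
    // Streamable HTTP servers may answer with plain JSON or an SSE stream.
    "Accept": "application/json, text/event-stream",
  };
  if (cfg.bearerToken) {
    headers["Authorization"] = `Bearer ${cfg.bearerToken}`;
  }
  return { url: cfg.uri, method: "POST", headers, body: JSON.stringify(message) };
}
```

The real transport has more to handle (session headers, streamed SSE responses); this sketch covers only the request direction.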
- Run it with `npx` (no global install required).
- Configure `URI` (the HTTP endpoint) and optionally `MCP_NAME` and `BEARER_TOKEN`.
- `MCP_NAME` allows multiple adapters side‑by‑side.
- Execute via `npx` or build locally.

Q: Do I need to build the project before using it?
A: No. The package can be executed directly with `npx`. Building is only required if you want to run a locally compiled version.
Q: Is the `MCP_NAME` environment variable mandatory?
A: It is optional for a single adapter. When you run multiple adapters in the same client, each must have a unique `MCP_NAME`.
Q: How is authentication handled?
A: Provide a `BEARER_TOKEN` env var; the adapter adds it as an `Authorization: Bearer <token>` header on every HTTP request.
Q: Can I use this with non‑Node clients?
A: Yes. As long as the client can spawn a process that communicates over STDIO, it can use the adapter regardless of the client language.
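To make that concrete, the only requirement on a client is the ability to spawn a process and exchange newline-delimited JSON-RPC over stdio. The self-contained sketch below (TypeScript, but the same pattern works in any language) uses a tiny echo child process as a stand-in for the adapter; with the real thing you would spawn `npx -y @pyroprompts/mcp-stdio-to-streamable-http-adapter` with `URI` set in the environment.

```typescript
import { spawn } from "node:child_process";

// Stand-in for the adapter: echoes each JSON-RPC request back as an
// empty result, so the example runs without a real MCP server.
const echoScript = `
  process.stdin.on("data", (d) => {
    const msg = JSON.parse(d.toString());
    process.stdout.write(JSON.stringify({ jsonrpc: "2.0", id: msg.id, result: {} }) + "\\n");
  });
`;

// Send one JSON-RPC message to a spawned stdio process and return the
// first line it writes back.
function callOverStdio(request: object): Promise<string> {
  return new Promise((resolve) => {
    const child = spawn(process.execPath, ["-e", echoScript]);
    child.stdout.once("data", (chunk) => {
      child.kill();
      resolve(chunk.toString().trim());
    });
    child.stdin.write(JSON.stringify(request) + "\n");
  });
}
```

Any client that can do the equivalent of this in its own language can drive the adapter.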
Q: Where can I debug communication?
A: Run `npm run inspector` to start the MCP Inspector, which gives a browser UI showing the request/response flow.
Integrate any MCP Client that has STDIO MCP Server Support (most do) with the Streamable HTTP MCP Servers
If you have a Streamable HTTP MCP Server, forking this repo, hardcoding the URI, and publishing it gives you a package you can point to, add to directories, and let people use via STDIO.
Note: This has similarities to the mcp-remote package
The MCP spec change adding a Streamable HTTP transport landed at the end of March 2025, and as of the end of April, no clients have adopted support. The typescript-sdk has merged the code but not released it. The Python SDK is still in development to support it. The Inspector supports it, but that's it.
This leaves developers in an awkward position: develop the MCP Server using STDIO or SSE (deprecated) so it works with clients, or develop with the Streamable HTTP transport that nobody can use yet.
I (ferrants) want to start integrating the Streamable HTTP MCP Servers beyond just the inspector, so I need a way to connect them to clients and LLMs right away!
This package aims to bridge the gap by being a STDIO MCP Server that relays to your Streamable HTTP MCP Server. This makes all MCP Clients support Streamable HTTP right away, and developers can build Streamable HTTP MCP Servers while still providing an installation method.
To use this with Claude Desktop, add the server config:
On MacOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
On Windows: `%APPDATA%/Claude/claude_desktop_config.json`
- `URI`: The URL of the Streamable HTTP MCP Server. This is required.
- `MCP_NAME`: The name of the MCP Server. This is optional. If you configure multiple adapters, it is required so they do not have the same names.
- `BEARER_TOKEN`: The Bearer token for the Streamable HTTP MCP Server. This is optional. If specified, it will be sent along in the Authorization header.

You can use it via `npx` in your Claude Desktop configuration like this:
```json
{
  "mcpServers": {
    "my-saas-app-development": {
      "command": "npx",
      "args": [
        "@pyroprompts/mcp-stdio-to-streamable-http-adapter"
      ],
      "env": {
        "URI": "http://localhost:3002/mcp",
        "MCP_NAME": "local-custom-streamable-http-adapter"
      }
    }
  }
}
```
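For reference, the three environment variables behave roughly like this startup check (a hypothetical sketch, not the package's actual code; the fallback name in particular is an assumption):

```typescript
// Hypothetical config reader; the real adapter's internals may differ.
interface Config {
  uri: string;          // URI is required
  name: string;         // MCP_NAME, with an assumed fallback when omitted
  bearerToken?: string; // BEARER_TOKEN is optional
}

function readConfig(env: Record<string, string | undefined>): Config {
  if (!env.URI) {
    // Without a URI there is nothing to relay to, so fail fast.
    throw new Error("URI is required: set it to the Streamable HTTP MCP endpoint");
  }
  return {
    uri: env.URI,
    // A default name suffices for a single adapter; unique names are
    // needed when several adapters run side by side in one client.
    name: env.MCP_NAME ?? "mcp-stdio-to-streamable-http-adapter",
    bearerToken: env.BEARER_TOKEN,
  };
}
```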
Or, if you clone the repo, you can build and use in your Claude Desktop configuration like this:
```json
{
  "mcpServers": {
    "my-saas-app-development": {
      "command": "node",
      "args": [
        "/path/to/mcp-stdio-to-streamable-http-adapter/build/index.js"
      ],
      "env": {
        "URI": "http://localhost:3002/mcp",
        "MCP_NAME": "local-custom-streamable-http-adapter"
      }
    }
  }
}
```
You can add multiple providers by referencing the same MCP server multiple times, but with different env arguments:
```json
{
  "mcpServers": {
    "my-saas-app-development": {
      "command": "node",
      "args": [
        "/path/to/mcp-stdio-to-streamable-http-adapter/build/index.js"
      ],
      "env": {
        "URI": "http://localhost:3002/mcp",
        "MCP_NAME": "local-custom-streamable-http-adapter"
      }
    },
    "pyroprompts": {
      "command": "node",
      "args": [
        "/path/to/mcp-stdio-to-streamable-http-adapter/build/index.js"
      ],
      "env": {
        "URI": "https://api.pyroprompts.com/mcp",
        "MCP_NAME": "pyroprompts",
        "BEARER_TOKEN": "abcdefg"
      }
    }
  }
}
```
With these configured, you'll see a tool for each in the Claude Desktop Home.
And then you can chat with other LLMs, with the calls and results showing up in the chat.
Or, configure in LibreChat like:
my-saas-app-development:
type: stdio
command: npx
args:
- -y
- @pyroprompts/mcp-stdio-to-streamable-http-adapter
env:
URI: "http://localhost:3002/mcp",
MCP_NAME: "my-custom-saas-app"
PATH: '/usr/local/bin:/usr/bin:/bin'
And it shows in LibreChat:
Install dependencies:

```shell
npm install
```

Build the server:

```shell
npm run build
```

For development with auto-rebuild:

```shell
npm run watch
```
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:

```shell
npm run inspector
```

The Inspector will provide a URL to access debugging tools in your browser.
Use code `MCPSTREAMABLEADAPTER` for 20 free automation credits on Pyroprompts.