by mavam
Provides a thin bridge that lets the pi CLI discover, describe, and call MCP tools through a single stable `mcporter` proxy command.
Pi Mcporter is a pi extension that exposes MCP tools via one stable CLI wrapper (mcporter). It keeps the pi experience CLI‑first while allowing automatic selection of the appropriate MCP tool for tasks such as querying Linear, reading Slack channels, or summarizing Notion pages.
```shell
npm install -g mcporter
npx mcporter list   # verify servers are visible
pi install npm:pi-mcporter
```
Or try it without installing:
```shell
pi -e npm:pi-mcporter
```
Features:
- A search → describe → call workflow using the mcporter tool.
- Settings in ~/.pi/agent/mcporter.json for a custom MCPorter config path, default timeout, and discovery mode (lazy or preload).
- Only a single mcporter tool is exposed, reducing context switching.
- Three actions: search, describe, and call.
- Two discovery modes: lazy (metadata on demand) or preload (metadata pre‑loaded for faster calls).
- Expandable tool output via Ctrl+O.
- Every MCP tool is reachable through the one mcporter proxy.

FAQ:

Q: How does pi pick the right MCP tool?
A: It runs a search for the keyword, then a describe to fetch the schema, and finally a call with the required arguments.

Q: A tool or server is not found. What should I check?
A: Run npx mcporter list and npx mcporter list <server> to verify the selector names, then authenticate with npx mcporter auth <server> if needed.

Q: How do I change the call timeout?
A: Set timeoutMs in ~/.pi/agent/mcporter.json or export MCPORTER_TIMEOUT_MS.

Q: What is the difference between lazy and preload mode?
A: lazy loads tool metadata only when required; preload loads it before the agent starts, allowing pi to skip the discovery steps for faster execution.

Use MCP tools from pi through one stable tool (mcporter), powered by MCPorter.
pi stays CLI‑first: the agent keeps using standard CLIs (gh, git, kubectl, aws, etc.) and reaches for mcporter when it adds clear value (for example: Linear, Slack, hosted auth-heavy integrations, cross-tool workflows).

- One stable mcporter tool instead of exposing many MCP tools
- Tool discovery (search), schema help (describe), and execution (call)

You need MCPorter installed and configured with at least one MCP server:
```shell
npm install -g mcporter
npx mcporter list   # verify your servers are visible
```
Install as a pi package:
```shell
pi install npm:pi-mcporter
```
Try it once without installing:
```shell
pi -e npm:pi-mcporter
```
Confirm your servers are visible, then start pi:
```shell
npx mcporter list
pi
```
Then ask things like:
- "What are my open Linear issues this sprint?"
- "Catch me up on #engineering in Slack from today."
- "Find the onboarding runbook in Notion and summarize the setup steps."

The mcporter tool has three actions that map to a natural discovery → execution workflow.
search — find tools by keyword

Use when you don't know the exact server or tool name.
```json
{ "action": "search", "query": "linear issue", "limit": 5 }
```
Returns matching selectors with short descriptions:
- linear.create_issue — Create a new issue in a Linear team
- linear.list_issues — List issues matching a filter
describe — get the full schema for a tool

Use when you know the selector but need to see its required parameters before calling.
```json
{ "action": "describe", "selector": "linear.create_issue" }
```
Returns the full JSON Schema for the tool's input, including required vs. optional fields and their types.
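As an illustration, a caller can use the schema's required list to check arguments before issuing a call. The schema below is a plausible shape for linear.create_issue in standard JSON Schema terms, not mcporter's literal output:

```python
# Hypothetical describe() result for linear.create_issue -- the exact
# schema mcporter returns may differ; this is standard JSON Schema shape.
schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "teamId": {"type": "string"},
        "priority": {"type": "integer"},
    },
    "required": ["title", "teamId"],
}

def missing_required(schema: dict, args: dict) -> list:
    """Return required fields that are absent from the call arguments."""
    return [f for f in schema.get("required", []) if f not in args]

print(missing_required(schema, {"title": "Fix login bug"}))  # ['teamId']
```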
call — invoke a tool

Use once you know the selector and its schema.
```json
{
  "action": "call",
  "selector": "linear.create_issue",
  "args": { "title": "Fix login bug", "teamId": "TEAM-1", "priority": 2 }
}
```
For arguments that are awkward to express as nested JSON, you can pass them as a JSON string via `argsJson` instead of `args`.
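To make the equivalence concrete, here is a small sketch (not mcporter's actual implementation) of how a proxy could normalize the two forms into one argument object:

```python
import json

def resolve_args(args=None, args_json=None):
    """Normalize `args` (an object) or `argsJson` (a JSON string) to one dict."""
    if args is not None:
        return args
    if args_json is not None:
        return json.loads(args_json)
    return {}

# Both forms yield the same arguments for the call.
via_object = resolve_args(args={"title": "Fix login bug", "priority": 2})
via_string = resolve_args(args_json='{"title": "Fix login bug", "priority": 2}')
assert via_object == via_string
```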
search "linear issue" → discover: linear.create_issue
describe linear.create_issue → learn required fields: title, teamId
call linear.create_issue → execute with those fields
In practice pi follows this pattern automatically. With mode: "preload" the catalog is already warm at agent start, so pi can often skip search/describe and jump straight to call.
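The three-step loop can be sketched end to end against a toy in-memory catalog. Everything here (the catalog, the keyword-matching rule, the fake tool function) is invented for illustration; only the selector names mirror the Linear examples above:

```python
# Toy catalog standing in for MCPorter's real tool discovery.
CATALOG = {
    "linear.create_issue": {
        "description": "Create a new issue in a Linear team",
        "schema": {"required": ["title", "teamId"]},
        "fn": lambda args: f"created {args['title']!r} in {args['teamId']}",
    },
    "linear.list_issues": {
        "description": "List issues matching a filter",
        "schema": {"required": []},
        "fn": lambda args: [],
    },
}

def search(query, limit=20):
    """Return selectors whose name or description contains every query word."""
    words = query.lower().split()
    hits = [s for s, t in CATALOG.items()
            if all(w in (s + " " + t["description"]).lower() for w in words)]
    return hits[:limit]

def describe(selector):
    """Return the (toy) input schema for a selector."""
    return CATALOG[selector]["schema"]

def call(selector, args):
    """Validate required args against the schema, then invoke the tool."""
    missing = [f for f in describe(selector)["required"] if f not in args]
    if missing:
        raise ValueError(f"missing required args: {missing}")
    return CATALOG[selector]["fn"](args)

sel = search("linear issue create")[0]   # 'linear.create_issue'
print(describe(sel))                     # {'required': ['title', 'teamId']}
print(call(sel, {"title": "Fix login bug", "teamId": "TEAM-1"}))
```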
Tool name: mcporter
Parameters:
- action: "search" | "describe" | "call"
- selector?: "server.tool" (required for describe and call)
- query?: free-text query for search
- limit?: result limit (default 20, max 100)
- args?: object arguments for call
- argsJson?: JSON-object-string fallback for call
- timeoutMs?: per-call timeout override

Configure the extension in ~/.pi/agent/mcporter.json:
```json
{
  "configPath": "/absolute/path/to/mcporter.json",
  "timeoutMs": 30000,
  "mode": "lazy"
}
```
MCPORTER_CONFIG=/absolute/path/to/mcporter.json still overrides configPath from the settings file.

- configPath: optional explicit MCPorter config path. If omitted, MCPorter uses its normal default resolution.
- timeoutMs: optional default call timeout in milliseconds. Tool-level timeoutMs still overrides this per call.
- mode: optional default MCP tool visibility mode.
- lazy: only the stable mcporter proxy tool is visible and MCP metadata loads on demand
- preload: still only exposes mcporter, but preloads MCP tool metadata before agent start so the agent can skip unnecessary discovery more often

Legacy extension flags --mcporter-config and --mcporter-timeout-ms are no longer supported. Use ~/.pi/agent/mcporter.json, MCPORTER_CONFIG, and per-call timeoutMs instead.
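The difference between the two modes can be illustrated with a toy metadata cache; fetch_metadata below stands in for a real MCP round-trip and is not part of mcporter's API:

```python
FETCHES = []  # records every (simulated) metadata round-trip

def fetch_metadata(selector):
    """Stand-in for a real MCP metadata fetch."""
    FETCHES.append(selector)
    return {"selector": selector}

class Catalog:
    def __init__(self, selectors, mode="lazy"):
        self.selectors = selectors
        self.cache = {}
        if mode == "preload":          # warm the cache before agent start
            for s in selectors:
                self.cache[s] = fetch_metadata(s)

    def describe(self, selector):      # lazy mode pays the cost here, once
        if selector not in self.cache:
            self.cache[selector] = fetch_metadata(selector)
        return self.cache[selector]

lazy = Catalog(["linear.create_issue", "slack.read_channel"], mode="lazy")
assert FETCHES == []                   # lazy: nothing fetched yet
lazy.describe("linear.create_issue")
assert FETCHES == ["linear.create_issue"]  # fetched only on first use
```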
Tool output follows pi's native expand/collapse behavior:
- Use the app.tools.expand keybinding (default Ctrl+O) to toggle expansion.

Troubleshooting:
- Tool or server not found: run npx mcporter list and npx mcporter list <server> to verify names.
- Authentication required: run npx mcporter auth <server>.
- Calls time out: raise timeoutMs in ~/.pi/agent/mcporter.json or override timeoutMs per tool call.
- Wrong MCPorter config picked up: set configPath in ~/.pi/agent/mcporter.json or export MCPORTER_CONFIG=<path>.
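The override order described earlier (a per-call timeoutMs beats the settings-file default; MCPORTER_CONFIG beats configPath) can be sketched as follows. The 60000 ms fallback is an assumption for illustration, not a documented default:

```python
def resolve_config_path(settings, env):
    """MCPORTER_CONFIG env var wins over configPath from the settings file."""
    return env.get("MCPORTER_CONFIG") or settings.get("configPath")

def resolve_timeout_ms(settings, call_timeout_ms=None, fallback=60000):
    """Per-call timeoutMs wins over the settings default, then a fallback."""
    if call_timeout_ms is not None:
        return call_timeout_ms
    return settings.get("timeoutMs", fallback)

settings = {"configPath": "/path/from/settings.json", "timeoutMs": 30000}
env = {"MCPORTER_CONFIG": "/path/from/env.json"}
print(resolve_config_path(settings, env))   # /path/from/env.json
print(resolve_timeout_ms(settings))         # 30000
print(resolve_timeout_ms(settings, 5000))   # 5000
```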