by wheattoast11
Orchestrates a network of asynchronous agents and streaming swarms to conduct ensemble‑consensus research, automatically creating indexed PGlite databases in WebAssembly and providing semantic, hybrid, and SQL search capabilities.
OpenRouter Agents MCP Server provides a production‑ready environment for multi‑agent AI research. It coordinates planning, parallel execution, and synthesis of tasks across a fleet of LLMs, stores every interaction in a local PGlite DB, and offers hybrid BM25‑vector search, SQL querying, and resource management.
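The plan → parallel execute → synthesize flow can be pictured with a toy sketch (mock functions only; the real server dispatches each sub-task to an LLM via OpenRouter and persists results in PGlite):

```javascript
// Toy plan -> execute -> synthesize pipeline (illustrative, not the server's code).
function plan(query) {
  // A real planner asks a planning model for sub-questions.
  return [`${query}: background`, `${query}: current state`, `${query}: open problems`];
}

function execute(subtask) {
  // Stand-in for a model call; returns a mock finding.
  return { subtask, finding: `summary of "${subtask}"` };
}

function synthesize(results) {
  // A real synthesis step asks a model to merge findings into a report.
  return results.map(r => `- ${r.finding}`).join('\n');
}

const report = synthesize(plan('MCP adoption').map(execute));
console.log(report); // three bullet findings, one per sub-task
```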
Quick start (npx):
npx @terminals-tech/openrouter-agents --stdio # for IDE integration
# or daemon mode
SERVER_API_KEY=devkey npx @terminals-tech/openrouter-agents
Configuration lives in .env (solo) or .mcp.json (team) with keys such as OPENROUTER_API_KEY, SERVER_API_KEY, and optional flags (MODE, ENSEMBLE_SIZE, PGLITE_DATA_DIR, etc.).
The HTTP transport exposes SSE endpoints (/sse, /messages). Available slash commands include /mcp-status, /mcp-research, /mcp-search, and /mcp-query.
Poll async jobs with get_job_status, cancel with cancel_job, and retrieve reports through get_report_content.
Server modes: AGENT, MANUAL, ALL for flexible tool exposure. UI resources are exposed at ui://research/viewer and ui://knowledge/graph.

Q: Do I need to run a database separately?
A: No. All data is stored in an embedded PGlite instance that runs in WebAssembly; the directory is configurable via PGLITE_DATA_DIR.
Q: Can I disable the hybrid indexer?
A: Yes. Set INDEXER_ENABLED=false in the environment or .mcp.json.
Q: How are costs managed across models?
A: The server categorizes models into HIGH_COST_MODELS, LOW_COST_MODELS, etc., and the planning step selects appropriate candidates based on the cost parameter of the request.
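The tiering can be sketched as a simple lookup with fallback to cheaper tiers (illustrative only; the tier lists mirror the HIGH_COST_MODELS / LOW_COST_MODELS / VERY_LOW_COST_MODELS environment variables, but the real planner also weighs task fit and availability):

```javascript
// Illustrative cost-tier model selection (not the server's actual logic).
const TIERS = {
  high: ['x-ai/grok-4', 'openai/gpt-5-chat', 'google/gemini-2.5-pro'],
  low: ['deepseek/deepseek-chat-v3.1', 'openai/gpt-5-mini'],
  very_low: ['openai/gpt-5-nano', 'deepseek/deepseek-chat-v3.1'],
};

// Pick up to `n` candidate models for the requested cost tier,
// falling back to cheaper tiers when a tier runs out.
function pickModels(cost, n = 2, tiers = TIERS) {
  const order = {
    high: ['high', 'low', 'very_low'],
    low: ['low', 'very_low'],
    very_low: ['very_low'],
  }[cost] || ['low', 'very_low'];
  const out = [];
  for (const tier of order) {
    for (const m of tiers[tier] || []) {
      if (out.length < n && !out.includes(m)) out.push(m);
    }
  }
  return out;
}

console.log(pickModels('low'));      // two low-cost candidates
console.log(pickModels('very_low')); // cheapest tier only
```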
Q: Is streaming supported for IDE integration?
A: Absolutely. Use the --stdio mode for Cursor/VS Code MCP clients or the HTTP/SSE endpoint for external tools.
Q: What authentication is required?
A: Provide an OpenRouter API key (OPENROUTER_API_KEY) for model access and a server‑side key (SERVER_API_KEY) for client authentication.
Q: Can I run the server globally without cloning the repo?
A: Yes. The npx command pulls the package directly from npm and launches the server.
Production-ready MCP server for multi-agent AI research with OpenRouter integration. Fully compliant with MCP Specification 2025-06-18 and prepared for November 2025 spec updates.
npm install @terminals-tech/openrouter-agents
npm install -g @terminals-tech/openrouter-agents
npx @terminals-tech/openrouter-agents --stdio
# or daemon
SERVER_API_KEY=devkey npx @terminals-tech/openrouter-agents
UI resources: ui://research/viewer, ui://knowledge/graph
Graph tools: graph_traverse, graph_path, graph_clusters, graph_pagerank
Built on @terminals-tech/embeddings, @terminals-tech/graph, and @terminals-tech/core.
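What a tool like graph_path computes can be pictured with a minimal BFS shortest-path sketch over an adjacency list (toy graph and function names are illustrative; the server runs this kind of traversal over its knowledge graph):

```javascript
// Minimal BFS shortest-path sketch (hypothetical toy graph, not the server's schema).
function shortestPath(adj, start, goal) {
  const prev = new Map([[start, null]]); // node -> predecessor on shortest path
  const queue = [start];
  while (queue.length) {
    const node = queue.shift();
    if (node === goal) {
      const path = [];
      for (let n = goal; n !== null; n = prev.get(n)) path.unshift(n);
      return path;
    }
    for (const next of adj[node] || []) {
      if (!prev.has(next)) { prev.set(next, node); queue.push(next); }
    }
  }
  return null; // no path exists
}

const adj = { a: ['b', 'c'], b: ['d'], c: ['d'], d: [] };
console.log(shortestPath(adj, 'a', 'd')); // [ 'a', 'b', 'd' ]
```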
npm install
npx @terminals-tech/openrouter-agents --verify
Choose your configuration method based on your use case:
Are you working alone or with a team?
│
├─ ALONE (local development)
│ │
│ └─► Use .env file
│ • Keeps secrets out of version control
│ • Personal API keys stay private
│ • Add .env to .gitignore
│
└─ TEAM (shared project)
│
└─► Use .mcp.json file
• Commit to version control
• Use ${VAR} syntax for secrets
• Team shares same configuration
• Each member sets own env vars
Quick Decision:
.env → Solo developer, local machine, prototype
.mcp.json → Team project, CI/CD, shareable config

OPENROUTER_API_KEY=your_openrouter_key
SERVER_API_KEY=your_http_transport_key
SERVER_PORT=3002
# Modes (pick one; default ALL)
# AGENT = agent-only + always-on ops (ping/status/jobs)
# MANUAL = individual tools + always-on ops
# ALL = agent + individual tools + always-on ops
MODE=ALL
# Orchestration
ENSEMBLE_SIZE=2
PARALLELISM=4
# Models (override as needed) - Updated with state-of-the-art cost-effective models
PLANNING_MODEL=openai/gpt-5-chat
PLANNING_CANDIDATES=openai/gpt-5-chat,google/gemini-2.5-pro,anthropic/claude-sonnet-4
HIGH_COST_MODELS=x-ai/grok-4,openai/gpt-5-chat,google/gemini-2.5-pro,anthropic/claude-sonnet-4,morph/morph-v3-large
LOW_COST_MODELS=deepseek/deepseek-chat-v3.1,z-ai/glm-4.5v,qwen/qwen3-coder,openai/gpt-5-mini,google/gemini-2.5-flash
VERY_LOW_COST_MODELS=openai/gpt-5-nano,deepseek/deepseek-chat-v3.1
# Storage
PGLITE_DATA_DIR=./researchAgentDB
PGLITE_RELAXED_DURABILITY=true
REPORT_OUTPUT_PATH=./research_outputs/
# Indexer
INDEXER_ENABLED=true
INDEXER_AUTO_INDEX_REPORTS=true
INDEXER_AUTO_INDEX_FETCHED=true
# MCP features
MCP_ENABLE_PROMPTS=true
MCP_ENABLE_RESOURCES=true
# Prompt strategy
PROMPTS_COMPACT=true
PROMPTS_REQUIRE_URLS=true
PROMPTS_CONFIDENCE=true
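The hybrid BM25+vector ranking the indexer settings enable can be illustrated with a simple score-fusion sketch (the 0.5/0.5 weighting and min-max normalization are assumptions for illustration; the server's actual ranking, including the optional LLM rerank, may differ):

```javascript
// Illustrative hybrid-search score fusion (assumed weights, not the server's formula).
function minMax(scores) {
  const lo = Math.min(...scores), hi = Math.max(...scores);
  return scores.map(s => (hi === lo ? 0 : (s - lo) / (hi - lo)));
}

// docs: [{ id, bm25, cosine }] -> [{ id, score }] sorted best-first.
function fuse(docs, wLex = 0.5, wVec = 0.5) {
  const lex = minMax(docs.map(d => d.bm25));   // normalize lexical scores
  const vec = minMax(docs.map(d => d.cosine)); // normalize vector similarities
  return docs
    .map((d, i) => ({ id: d.id, score: wLex * lex[i] + wVec * vec[i] }))
    .sort((a, b) => b.score - a.score);
}

const ranked = fuse([
  { id: 'r1', bm25: 12.0, cosine: 0.42 }, // strong keyword match
  { id: 'r2', bm25: 3.5, cosine: 0.91 },  // strong semantic match
  { id: 'r3', bm25: 8.0, cosine: 0.70 },  // balanced on both signals
]);
console.log(ranked.map(r => r.id)); // the balanced document ranks first
```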
node src/server/mcpServer.js --stdio
SERVER_API_KEY=$SERVER_API_KEY node src/server/mcpServer.js
$env:OPENROUTER_API_KEY='your_key'
$env:INDEXER_ENABLED='true'
node src/server/mcpServer.js --stdio
$env:OPENROUTER_API_KEY='your_key'
$env:SERVER_API_KEY='devkey'
$env:SERVER_PORT='3002'
node src/server/mcpServer.js
Dev (HTTP/SSE):
SERVER_API_KEY=devkey INDEXER_ENABLED=true node src/server/mcpServer.js
STDIO (Cursor/VS Code):
OPENROUTER_API_KEY=your_key INDEXER_ENABLED=true node src/server/mcpServer.js --stdio
You can register this server directly in MCP clients that support JSON server manifests.
Minimal examples:
{
"servers": {
"openrouter-agents": {
"command": "npx",
"args": ["@terminals-tech/openrouter-agents", "--stdio"],
"env": {
"OPENROUTER_API_KEY": "${OPENROUTER_API_KEY}",
"SERVER_API_KEY": "${SERVER_API_KEY}",
"PGLITE_DATA_DIR": "./researchAgentDB",
"INDEXER_ENABLED": "true"
}
}
}
}
{
"servers": {
"openrouter-agents": {
"url": "http://127.0.0.1:3002",
"sse": "/sse",
"messages": "/messages",
"headers": {
"Authorization": "Bearer ${SERVER_API_KEY}"
}
}
}
}
With the package installed globally (or via npx), MCP clients can spawn the server automatically. See your client's docs for where to place this JSON (e.g., ~/.config/client/mcp.json).
One-liner setup for Claude Code:
claude mcp add openrouter-agents -- npx @terminals-tech/openrouter-agents --stdio
Or interactive setup with slash commands and hooks:
npx @terminals-tech/openrouter-agents --setup-claude
| Command | Description |
|---|---|
| /mcp-status | Check server health and recent activity |
| /mcp-research | Run synchronous research query |
| /mcp-async-research | Run async research (returns job_id) |
| /mcp-search | Search the knowledge base |
| /mcp-query | Execute SQL query |
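The query tool behind /mcp-query is SELECT-only. A minimal guard sketch showing how read-only enforcement can work (an illustration of the constraint, not the server's actual validation, which may be stricter):

```javascript
// Sketch of a SELECT-only guard (also permitting WITH for CTEs, an assumption here).
function assertReadOnly(sql) {
  // Strip line and block comments before inspecting the statement.
  const stripped = sql.replace(/--.*$/gm, '').replace(/\/\*[\s\S]*?\*\//g, '').trim();
  if (!/^(select|with)\b/i.test(stripped)) {
    throw new Error('Only SELECT queries are allowed');
  }
  // Reject statement chaining like "SELECT 1; DROP TABLE reports".
  if (stripped.replace(/;\s*$/, '').includes(';')) {
    throw new Error('Multiple statements are not allowed');
  }
  return stripped;
}

console.log(assertReadOnly('SELECT id FROM reports WHERE id = $1')); // passes
try {
  assertReadOnly('DELETE FROM reports');
} catch (e) {
  console.log(e.message); // "Only SELECT queries are allowed"
}
```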
Set your API key before using:
export OPENROUTER_API_KEY="sk-or-..."
The setup creates a .mcp.json file for team-shareable configuration:
{
"mcpServers": {
"openrouter-agents": {
"command": "npx",
"args": ["@terminals-tech/openrouter-agents", "--stdio"],
"env": {
"OPENROUTER_API_KEY": "${OPENROUTER_API_KEY}",
"INDEXER_ENABLED": "true",
"MCP_ENABLE_TASKS": "true"
}
}
}
}
After setup, verify the connection:
/mcp-status
Or use the tools directly:
ping {} // → {"pong":true}
get_server_status {} // → Full health check
list_tools {} // → Available tools
See .claude/README.md for detailed configuration options.
Always-on ops: ping, get_server_status, job_status, get_job_status, cancel_job
Agent mode: agent (single entrypoint for research / follow_up / retrieve / query)
Research: submit_research (async), conduct_research (sync/stream), research_follow_up
Jobs: get_job_status, cancel_job
Search: search (hybrid BM25+vector with optional LLM rerank), retrieve (index/sql wrapper)
SQL: query (SELECT-only, optional explain)
History: get_past_research, list_research_history, get_report_content
Maintenance: backup_db (tar.gz), export_reports, import_reports, db_health, reindex_vectors
Models: list_models
Web: search_web, fetch_url
Indexer: index_texts, index_url, search_index, index_status

Use the tool_patterns resource to view JSON recipes describing effective chaining, e.g. stream job events from /jobs/:id/events, then get the report content.

Notes:
- Data lives under PGLITE_DATA_DIR (default ./researchAgentDB). Backups are tarballs in ./backups.
- Use list_models to discover current provider capabilities and ids.

Architecture: see docs/diagram-architecture.mmd (Mermaid). Render to SVG with Mermaid CLI if installed:
npx @mermaid-js/mermaid-cli -i docs/diagram-architecture.mmd -o docs/diagram-architecture.svg
Or use the script:
npm run gen:diagram
If the image doesn’t render in your viewer, open docs/diagram-architecture-branded.svg directly.
How it differs from typical “agent chains”:
“Give me an executive briefing on MCP status as of July 2025.”
“Find vision‑capable models and route images gracefully.”
→ /models discovered and filtered, router template generated, fallback to text models.
"Compare orchestration patterns for bounded parallelism."
Run locally with node src/server/mcpServer.js --stdio. Use the prompts (planning_prompt, synthesis_prompt) directly in Cursor to scaffold tasks.
Scripts: npm run stdio, npm start, npx @terminals-tech/openrouter-agents --stdio, npm run gen:examples

Tool-call cheat sheet:
- list_models { refresh:false }
- submit_research { q:"<query>", cost:"low", aud:"intermediate", fmt:"report", src:true }
- Poll: get_job_status { job_id:"..." }; cancel: cancel_job { job_id:"..." }
- search { q:"<query>", k:10, scope:"both" }
- query { sql:"SELECT ... WHERE id = $1", params:[1], explain:true }
- get_past_research { query:"<query>", limit:5 }
- index_url { url:"https://..." }

Open http://localhost:3002/ui to stream job events (SSE).

Install and run without cloning:
npx @terminals-tech/openrouter-agents --stdio
# or daemon
SERVER_API_KEY=your_key npx @terminals-tech/openrouter-agents
npm login
npm version patch -m "chore(release): %s"
git push --follow-tags
npm publish --access public --provenance
Feedback and maintenance:
- rate_research_report { reportId, rating:1..5, comment } is stored to the DB and drives follow-ups.
- export_reports + backup_db capture artifacts for audit.
- Regenerate examples with npm run gen:examples.
- Browse history: list_research_history, get_report_content { reportId }
- Index maintenance: reindex_vectors, index_status, search_index { query }
- Branded diagram: docs/diagram-architecture-branded.svg (logo links to https://terminals.tech).
Explore related MCPs that share similar capabilities and solve comparable challenges
by reading-plus-ai
Enables comprehensive, well‑cited research reports by elaborating questions, generating sub‑questions, performing web searches, analyzing content, and formatting findings into structured artifacts.
by takashiishida
Fetches arXiv LaTeX sources and provides them via MCP for LLM clients to accurately interpret mathematical content in scientific papers.
by prashalruchiranga
Interact with the arXiv API using natural language to retrieve article metadata, download PDFs, perform searches, and load full texts into large language model contexts.
by drAbreu
Provides author disambiguation, institution resolution, work retrieval, citation analysis, and ORCID matching via the OpenAlex API, delivering streamlined, structured responses optimized for AI‑agent consumption.
by ali-kh7
Aggregates web data via Tavily's Search and Crawl APIs, structures the results for LLM-friendly markdown generation.
by universal-mcp
Provides a standardized API to interact with Semantic Scholar's tools and services, enabling unified access through the Universal MCP framework.
by netdata
Delivers real‑time, per‑second infrastructure monitoring with zero‑configuration agents, on‑edge machine‑learning anomaly detection, and built‑in dashboards.
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
An MCP server implementation that provides a tool for dynamic and reflective problem-solving through a structured thinking process.
{
"mcpServers": {
"openrouter-agents": {
"command": "npx",
"args": [
"@terminals-tech/openrouter-agents",
"--stdio"
],
"env": {
"OPENROUTER_API_KEY": "<YOUR_OPENROUTER_API_KEY>",
"SERVER_API_KEY": "<YOUR_SERVER_API_KEY>"
}
}
}
}

claude mcp add openrouter-agents -- npx @terminals-tech/openrouter-agents --stdio