by Pearl-com
Exposes Pearl's AI and expert services via a standardized Model Context Protocol interface, enabling MCP clients to interact with AI assistants and human experts.
Pearl MCP Server provides a Model Context Protocol (MCP) endpoint that bridges Pearl's advanced AI assistants and human expert network with any MCP‑compatible client (e.g., Claude Desktop, Cursor). It abstracts the underlying Pearl API behind a simple transport layer (stdio or Server‑Sent Events) so applications can request AI‑only answers, AI‑augmented expert assistance, or direct human expert help.
Quick reference:
Configuration: a .env file with your PEARL_API_KEY
Run locally (stdio): pearl-mcp-server --api-key <your-key>
Run with SSE: pearl-mcp-server --api-key <your-key> --transport sse --port 8000
Remote bridging: mcp-remote (Node.js) for clients that require an HTTP endpoint
Available tools: ask_pearl_ai, ask_pearl_expert, ask_expert, get_conversation_status, and get_conversation_history

Q: Do I need a Pearl account to run the server?
A: Yes, you must obtain a Pearl API key from the Pearl contact page.

Q: Can I run the server publicly?
A: The repository supplies a local implementation; Pearl also offers a hosted MCP endpoint (https://mcp.pearl.com/mcp). For public exposure you would need to host the Python server behind your own HTTPS reverse proxy.

Q: Which transport should I choose?
A: Use stdio for local desktop clients; choose SSE when a network‑accessible HTTP endpoint is required.

Q: How are expert categories selected?
A: The Pearl API analyzes the query context and automatically routes the request to the most relevant expert domain (e.g., medical, legal, technical).

Q: Is the API key exposed to clients?
A: No. The server handles the API key internally; clients only interact with the MCP layer.
A Model Context Protocol (MCP) server implementation that exposes Pearl's AI and Expert services through a standardized interface. This server allows MCP clients like Claude Desktop, Cursor, and other MCP-compatible applications to interact with Pearl's advanced AI assistants and human experts.
git clone https://github.com/Pearl-com/pearl_mcp_server.git
cd pearl_mcp_server
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
pip install -e .
Create a .env file in the src directory:
PEARL_API_KEY=your-api-key-here
Start the server using either stdio (default) or SSE transport:
# Using stdio transport (default)
pearl-mcp-server --api-key your-api-key
# Using SSE transport on custom port
pearl-mcp-server --api-key your-api-key --transport sse --port 8000
Pearl provides a hosted MCP server at:
https://mcp.pearl.com/mcp
This can be used directly with any MCP client without installing the Python application locally.
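As a quick illustration, the snippet below connects to the hosted endpoint over SSE using the MCP Python SDK. This is a minimal sketch: the sse_client helper comes from the same mcp package used in the Python example later in this README, the SSE URL (https://mcp.pearl.com/sse) is taken from the remote configuration shown further below, and any authentication the hosted endpoint may require is not covered here.

import asyncio
from mcp.client.session import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Connect to the hosted Pearl MCP endpoint over SSE
    # (URL assumed from the mcp-remote configuration later in this README)
    async with sse_client("https://mcp.pearl.com/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List the Pearl tools exposed by the hosted server
            tools = await session.list_tools()
            print(tools)

asyncio.run(main())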
The server provides the following tools:
ask_pearl_ai - Ask Pearl's AI assistant (AI-only answers)
  question: The user's query
  chat_history (optional): Previous conversation context
  session_id (optional): For continuing conversations

ask_pearl_expert - AI-augmented assistance backed by Pearl's expert network

ask_expert - Direct help from a human expert

get_conversation_status - Check the state of an ongoing conversation
  session_id: The conversation's session identifier

get_conversation_history - Retrieve the messages exchanged in a conversation
  session_id: The conversation's session identifier
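The sketch below shows how these parameters can fit together to continue an existing conversation and then inspect it. It assumes an already-initialized ClientSession named session (as created in the Python example near the end of this README); the structure of chat_history is an assumption, since this README does not document its exact format.

# Sketch: continue a conversation, then inspect it.
# Assumes an initialized ClientSession named `session`
# (see the full Python example near the end of this README).
async def continue_conversation(session, session_id):
    # Follow-up question that reuses an existing session
    result = await session.call_tool(
        "ask_pearl_ai",
        {
            "question": "Can you expand on that?",
            "session_id": session_id,
            # chat_history format is an assumption; this README does not specify it
            "chat_history": [{"role": "user", "content": "What is MCP?"}],
        },
    )
    print(result)

    # Check the conversation's state, then fetch its transcript
    status = await session.call_tool("get_conversation_status", {"session_id": session_id})
    history = await session.call_tool("get_conversation_history", {"session_id": session_id})
    print(status)
    print(history)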
Pearl's MCP server provides access to a wide range of expert categories. The appropriate expert category is automatically determined by Pearl's API based on the context of your query, ensuring you're connected with the most relevant expert for your needs.
Here are the main categories of expertise available:
Medical & Healthcare
Legal & Financial
Technical & Professional
Education & Career
Lifestyle & Personal
Each expert category can be accessed through the ask_expert or ask_pearl_expert tools. You don't need to specify the category - simply describe your question or problem, and Pearl's AI will automatically route your request to the most appropriate expert type based on the context.
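As a rough sketch of what this looks like from a client, the call below passes only the question; no category field is supplied, because routing happens on Pearl's side. The helper name and surrounding setup are illustrative and assume an initialized ClientSession, as in the Python example near the end of this README.

# Sketch: ask a human expert without specifying a category.
# Assumes an initialized ClientSession named `session`.
async def ask_human_expert(session):
    result = await session.call_tool(
        "ask_expert",
        {
            # No expert category is passed; Pearl's API routes the question
            # to the most relevant domain (e.g., medical, legal, technical)
            "question": "What should I check before signing a residential lease?",
        },
    )
    print(result)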
For connecting to a local MCP server using stdio transport, add the following configuration to your MCP client:
{
  "pearl-mcp-server": {
    "type": "stdio",
    "command": "pearl-mcp-server",
    "args": ["--api-key", "your-api-key"],
    "env": {
      "PEARL_API_KEY": "Your Pearl Api Key"
    }
  }
}
Some MCP clients don't support direct connection to remote MCP servers. For these clients, you can use the mcp-remote
package as a bridge:
Prerequisites: Node.js installed (the mcp-remote bridge runs via npx).
Configuration for remote server:
{
  "mcpServers": {
    "pearl-remote": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.pearl.com/sse"
      ]
    }
  }
}
Configuration file locations:
Claude Desktop (Windows): %APPDATA%\Claude\claude_desktop_config.json
Claude Desktop (macOS): ~/Library/Application Support/Claude/claude_desktop_config.json
Cursor: ~/.cursor/mcp.json
Windsurf: ~/.codeium/windsurf/mcp_config.json
Additional Options:
To always run the latest version of the bridge, add @latest to the npx command:
"args": ["mcp-remote@latest", "https://mcp.pearl.com/sse"]
Troubleshooting:
Clear cached authentication data if the connection fails: rm -rf ~/.mcp-auth
Watch Claude Desktop's MCP log on Windows: Get-Content "$env:APPDATA\Claude\Logs\mcp.log" -Wait -Tail 20
Watch Claude Desktop's MCP log on macOS: tail -n 20 -F ~/Library/Logs/Claude/mcp*.log
Test the remote endpoint directly: npx mcp-remote-client https://mcp.pearl.com/sse
import asyncio
from mcp.client.session import ClientSession
from mcp.client.stdio import StdioServerParameters, stdio_client

async def main():
    # For stdio transport
    async with stdio_client(
        StdioServerParameters(command="pearl-mcp-server", args=["--api-key", "your-api-key"])
    ) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # List available tools
            tools = await session.list_tools()
            print(tools)

            # Call Pearl AI
            result = await session.call_tool(
                "ask_pearl_ai",
                {
                    "question": "What is MCP?",
                    "session_id": "optional-session-id"
                }
            )
            print(result)

asyncio.run(main())
To obtain a Pearl API key for using this server, request one via the Pearl contact page (see the FAQ above).
Keep your API key secure and never commit it to version control.
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
{ "mcpServers": { "pearl-remote": { "command": "npx", "args": [ "mcp-remote", "https://mcp.pearl.com/sse" ], "env": { "API_KEY": "<YOUR_API_KEY>" } } } }