by ConechoAI
Provides access to OpenAI's websearch capability through the Model Context Protocol, enabling AI assistants to retrieve up-to-date web information during conversations.
Enables AI assistants to perform web searches using OpenAI's websearch endpoint, delivering fresh information that may not be present in the model's training data.
Q: How do I install the server?
A: Via uvx or standard pip.
Q: How do I invoke the search tool?
A: Call the web_search tool from the assistant with required arguments (type, search_context_size) and optional user_location metadata.
Q: Do I need an OpenAI API key?
A: Yes, the server requires a valid OPENAI_API_KEY environment variable.
Q: Which editors or platforms are supported?
A: Claude.app, Zed editor, and any client that implements the Model Context Protocol.
Q: How do I debug the server?
A: Use the MCP inspector, e.g., npx @modelcontextprotocol/inspector uvx openai-websearch-mcp.
Q: Can I customize the search context size?
A: Yes, set search_context_size to low, medium, or high when calling the tool.
Q: Is user location required?
A: No, it is optional but can improve search relevance when provided.
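For illustration, a user_location payload might look like the sketch below. The field names follow OpenAI's approximate-location schema for web search; the exact shape this server accepts should be verified against its tool schema.

```python
# Hypothetical user_location payload for localized results.
# Field names follow OpenAI's approximate-location schema; verify
# against this server's tool schema before relying on them.
user_location = {
    "type": "approximate",
    "country": "US",                     # ISO 3166-1 alpha-2 country code
    "city": "San Francisco",
    "region": "California",
    "timezone": "America/Los_Angeles",   # IANA timezone name
}
```

Omitting user_location entirely is always valid, since the parameter is optional.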
An advanced MCP server that provides intelligent web search capabilities using OpenAI's reasoning models. Perfect for AI assistants that need up-to-date information with smart reasoning capabilities.
Smart reasoning_effort defaults based on use case.

One-step installation via uvx:

OPENAI_API_KEY=sk-xxxx uvx --with openai-websearch-mcp openai-websearch-mcp-install
Replace sk-xxxx with your OpenAI API key from the OpenAI Platform.
Add to your claude_desktop_config.json:
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini"
      }
    }
  }
}
Add to your MCP settings in Cursor (open settings with Cmd/Ctrl + ,):
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "uvx",
      "args": ["openai-websearch-mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini"
      }
    }
  }
}
Claude Code automatically detects MCP servers configured for Claude Desktop. Use the same configuration as above for Claude Desktop.
For local testing, use the absolute path to your virtual environment:
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "/path/to/your/project/.venv/bin/python",
      "args": ["-m", "openai_websearch_mcp"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here",
        "OPENAI_DEFAULT_MODEL": "gpt-5-mini",
        "PYTHONPATH": "/path/to/your/project/src"
      }
    }
  }
}
openai_web_search
Intelligent web search with reasoning model support.
| Parameter | Type | Description | Default |
|---|---|---|---|
| input | string | The search query or question to search for | Required |
| model | string | AI model to use. Supports gpt-4o, gpt-4o-mini, gpt-5, gpt-5-mini, gpt-5-nano, o3, o4-mini | gpt-5-mini |
| reasoning_effort | string | Reasoning effort level: low, medium, high, minimal | Smart default |
| type | string | Web search API version | web_search_preview |
| search_context_size | string | Context amount: low, medium, high | medium |
| user_location | object | Optional location for localized results | null |
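As a sketch of how a client might assemble and validate arguments for openai_web_search before issuing the tool call (the helper and its validation logic are illustrative, not the server's actual code):

```python
# Illustrative client-side helper that builds an arguments dict for the
# openai_web_search tool, enforcing the enum-style values from the
# parameter table. Parameter names mirror the tool's schema, which is
# why two of them shadow Python builtins (input, type).
ALLOWED_MODELS = {"gpt-4o", "gpt-4o-mini", "gpt-5", "gpt-5-mini",
                  "gpt-5-nano", "o3", "o4-mini"}
ALLOWED_EFFORT = {"low", "medium", "high", "minimal"}
ALLOWED_CONTEXT = {"low", "medium", "high"}

def build_search_args(input, model="gpt-5-mini", reasoning_effort=None,
                      type="web_search_preview",
                      search_context_size="medium", user_location=None):
    if model not in ALLOWED_MODELS:
        raise ValueError(f"unsupported model: {model}")
    if reasoning_effort is not None and reasoning_effort not in ALLOWED_EFFORT:
        raise ValueError(f"unsupported reasoning_effort: {reasoning_effort}")
    if search_context_size not in ALLOWED_CONTEXT:
        raise ValueError(f"unsupported search_context_size: {search_context_size}")
    args = {"input": input, "model": model, "type": type,
            "search_context_size": search_context_size}
    if reasoning_effort is not None:
        args["reasoning_effort"] = reasoning_effort
    if user_location is not None:
        args["user_location"] = user_location
    return args

args = build_search_args("latest AI reasoning models",
                         model="gpt-5", reasoning_effort="high")
```

Optional parameters are omitted from the dict rather than sent as null, so the server's own defaults apply.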
Once configured, simply ask your AI assistant to search for information using natural language:
"Search for the latest developments in AI reasoning models using openai_web_search"
"Use openai_web_search with gpt-5 and high reasoning effort to provide a comprehensive analysis of quantum computing breakthroughs"
"Search for local tech meetups in San Francisco this week using openai_web_search"
The AI assistant will automatically use the openai_web_search tool with appropriate parameters based on your request.
For fast searches, use gpt-5-mini with reasoning_effort: "low"; for deep research, use gpt-5 with reasoning_effort: "medium" or "high".

| Model | Reasoning | Default Effort | Best For |
|---|---|---|---|
| gpt-4o | ❌ | N/A | Standard search |
| gpt-4o-mini | ❌ | N/A | Basic queries |
| gpt-5-mini | ✅ | low | Fast iterations |
| gpt-5 | ✅ | medium | Deep research |
| gpt-5-nano | ✅ | medium | Balanced approach |
| o3 | ✅ | medium | Advanced reasoning |
| o4-mini | ✅ | medium | Efficient reasoning |
# Install and run directly
uvx openai-websearch-mcp
# Or install globally
uv tool install openai-websearch-mcp
# Install from PyPI
pip install openai-websearch-mcp
# Run the server
python -m openai_websearch_mcp
# Clone the repository
git clone https://github.com/yourusername/openai-websearch-mcp.git
cd openai-websearch-mcp
# Install dependencies
uv sync
# Run in development mode
uv run python -m openai_websearch_mcp
# Clone and setup
git clone https://github.com/yourusername/openai-websearch-mcp.git
cd openai-websearch-mcp
# Create virtual environment and install dependencies
uv sync
# Run tests
uv run python -m pytest
# Install in development mode
uv pip install -e .
| Variable | Description | Default |
|---|---|---|
| OPENAI_API_KEY | Your OpenAI API key | Required |
| OPENAI_DEFAULT_MODEL | Default model to use | gpt-5-mini |
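The table above might translate into startup code along these lines (an illustrative sketch, not the server's actual configuration loader):

```python
import os

# OPENAI_API_KEY is mandatory; OPENAI_DEFAULT_MODEL falls back to
# gpt-5-mini when unset, matching the defaults documented above.
def load_config():
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        raise RuntimeError("OPENAI_API_KEY environment variable is required")
    model = os.environ.get("OPENAI_DEFAULT_MODEL", "gpt-5-mini")
    return {"api_key": api_key, "model": model}
```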
# For uvx installations
npx @modelcontextprotocol/inspector uvx openai-websearch-mcp
# For pip installations
npx @modelcontextprotocol/inspector python -m openai_websearch_mcp
Issue: "Unsupported parameter: 'reasoning.effort'"
Solution: This occurs when the reasoning_effort parameter is passed to non-reasoning models (gpt-4o, gpt-4o-mini). The server handles this automatically by applying reasoning parameters only to compatible models.
Issue: "No module named 'openai_websearch_mcp'"
Solution: Ensure the package is installed correctly and that your Python path includes the package location.
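The compatibility handling for the reasoning.effort error can be sketched as a small filter that drops reasoning parameters before calling models that do not support them. This is a hypothetical illustration, not the server's actual code:

```python
# Drop reasoning parameters for models that would reject them with
# "Unsupported parameter: 'reasoning.effort'". Hypothetical sketch.
NON_REASONING_MODELS = {"gpt-4o", "gpt-4o-mini"}

def request_kwargs(model, reasoning_effort="medium"):
    kwargs = {"model": model}
    if model not in NON_REASONING_MODELS:
        kwargs["reasoning"] = {"effort": reasoning_effort}
    return kwargs
```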
This project is licensed under the MIT License - see the LICENSE file for details.
Explore related MCPs that share similar capabilities and solve comparable challenges.
by exa-labs
Provides real-time web search capabilities to AI assistants via a Model Context Protocol server, enabling safe and controlled access to the Exa AI Search API.
by perplexityai
Enables Claude and other MCP‑compatible applications to perform real‑time web searches through the Perplexity (Sonar) API without leaving the MCP ecosystem.
by MicrosoftDocs
Provides semantic search and fetch capabilities for Microsoft official documentation, returning content in markdown format via a lightweight streamable HTTP transport for AI agents and development tools.
by elastic
Enables natural‑language interaction with Elasticsearch indices via the Model Context Protocol, exposing tools for listing indices, fetching mappings, performing searches, running ES|QL queries, and retrieving shard information.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, extraction, retrieval, and RAG capabilities across a wide range of data sources and connectors.
by ihor-sokoliuk
Provides web search capabilities via the SearXNG API, exposing them through an MCP server for seamless integration with AI agents and tools.
by mamertofabian
Fast cross‑platform file searching leveraging the Everything SDK on Windows, Spotlight on macOS, and locate/plocate on Linux.
by spences10
Provides unified access to multiple search engines, AI response tools, and content processing services through a single Model Context Protocol server.
by cr7258
Provides Elasticsearch and OpenSearch interaction via Model Context Protocol, enabling document search, index management, cluster monitoring, and alias operations.
{
  "mcpServers": {
    "openai-websearch-mcp": {
      "command": "python",
      "args": [
        "-m",
        "openai_websearch_mcp"
      ],
      "env": {
        "OPENAI_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}

claude mcp add openai-websearch-mcp python -m openai_websearch_mcp