by hyperbrowserai
Provides tools to scrape, extract structured data, crawl webpages, and access general‑purpose browser agents such as OpenAI CUA, Anthropic Claude Computer Use, and lightweight Browser Use via a Model Context Protocol server.
Hyperbrowser MCP Server delivers a set of utilities for interacting with the web programmatically. It can fetch and format page content, traverse linked pages, convert messy HTML into structured JSON, perform Bing searches, and invoke powerful browser‑automation agents (OpenAI CUA, Claude Computer Use, and a generic Browser Use agent). All capabilities are exposed through the Model Context Protocol, allowing AI models to call them as remote functions.
npx hyperbrowser-mcp <YOUR-HYPERBROWSER-API-KEY>
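The command above starts the server over stdio. As a minimal sketch of driving it programmatically (not from the Hyperbrowser docs; it assumes the official TypeScript MCP SDK, @modelcontextprotocol/sdk, is installed and HYPERBROWSER_API_KEY is exported), you can connect a client and list the available tools:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the published package via npx, passing the API key through the environment.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "hyperbrowser-mcp"],
    env: { HYPERBROWSER_API_KEY: process.env.HYPERBROWSER_API_KEY ?? "" },
  });

  const client = new Client({ name: "example-client", version: "1.0.0" }, { capabilities: {} });
  await client.connect(transport);

  // Discover what the server exposes before calling anything.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  await client.close();
}

main().catch(console.error);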
- Register the hyperbrowser server, using npx as the command and supplying the API key via an environment variable.
- Call its tools (scrape_webpage, crawl_webpages, search_with_bing, openai_computer_use_agent, and more) from the client's prompt or programmatic API.
- Installs with a single npx command; manual dev setup via npm/yarn is also supported.
Q: Do I need a Hyperbrowser account?
A: Yes, an API key from Hyperbrowser is required to authenticate requests.
Q: Can I run the server locally for development?
A: Absolutely. Clone the repo, run npm install && npm run build, then start with node dist/server.js.
Q: Which environment variable holds the API key?
A: HYPERBROWSER_API_KEY (or the generic placeholder API_KEY in configuration templates).
Q: Is the server compatible with other MCP clients?
A: The server follows the standard Model Context Protocol, so any MCP‑compatible client can call its methods.
Q: How are profiles managed?
A: Use the create_profile, list_profiles, and delete_profile tools to maintain persistent browsing contexts.
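For illustration only (the empty argument objects are assumptions, not the documented schemas), the profile tools can be called through the same MCP client pattern used elsewhere on this page:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function manageProfiles() {
  const client = new Client({ name: "profile-demo", version: "1.0.0" }, { capabilities: {} });
  await client.connect(new StdioClientTransport({
    command: "npx",
    args: ["-y", "hyperbrowser-mcp"],
    env: { HYPERBROWSER_API_KEY: process.env.HYPERBROWSER_API_KEY ?? "" },
  }));

  // Create a persistent profile, then list all profiles.
  // Empty arguments are an assumption; inspect each tool's inputSchema
  // (via listTools()) for the exact fields it expects.
  const created = await client.callTool({ name: "create_profile", arguments: {} });
  console.log(created);

  const profiles = await client.callTool({ name: "list_profiles", arguments: {} });
  console.log(profiles);

  await client.close();
}

manageProfiles().catch(console.error);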
This is Hyperbrowser's Model Context Protocol (MCP) Server. It provides various tools to scrape, extract structured data, and crawl webpages. It also provides easy access to general purpose browser agents like OpenAI's CUA, Anthropic's Claude Computer Use, and Browser Use.
More information about Hyperbrowser can be found here. The Hyperbrowser API supports a superset of the features present in the MCP server.
More information about the Model Context Protocol can be found here.
To install the server, run:
npx hyperbrowser-mcp <YOUR-HYPERBROWSER-API-KEY>
For Cursor, add to ~/.cursor/mcp.json like this:
{
"mcpServers": {
"hyperbrowser": {
"command": "npx",
"args": ["-y", "hyperbrowser-mcp"],
"env": {
"HYPERBROWSER_API_KEY": "YOUR-API-KEY"
}
}
}
}
For Windsurf, add to your ./codeium/windsurf/model_config.json like this:
{
"mcpServers": {
"hyperbrowser": {
"command": "npx",
"args": ["-y", "hyperbrowser-mcp"],
"env": {
"HYPERBROWSER_API_KEY": "YOUR-API-KEY"
}
}
}
}
For development purposes, you can run the server directly from the source code.
Clone the repository:
git clone git@github.com:hyperbrowserai/mcp.git hyperbrowser-mcp
cd hyperbrowser-mcp
Install dependencies and build the server:
npm install # or yarn install
npm run build
Run the server:
node dist/server.js
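As a quick smoke test of the local build (a sketch, again assuming the TypeScript MCP SDK is available and HYPERBROWSER_API_KEY is exported), you can point a client at dist/server.js instead of the published npx package:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function smokeTest() {
  // Spawn the locally built server rather than the npm-published one.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/server.js"],
    env: { HYPERBROWSER_API_KEY: process.env.HYPERBROWSER_API_KEY ?? "" },
  });

  const client = new Client({ name: "local-smoke-test", version: "0.0.1" }, { capabilities: {} });
  await client.connect(transport);

  const { tools } = await client.listTools();
  console.log("Tools exposed by the local build:", tools.map((t) => t.name));

  await client.close();
}

smokeTest().catch(console.error);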
This is an example config for the Hyperbrowser MCP server for the Claude Desktop client.
{
"mcpServers": {
"hyperbrowser": {
"command": "npx",
"args": ["--yes", "hyperbrowser-mcp"],
"env": {
"HYPERBROWSER_API_KEY": "your-api-key"
}
}
}
}
- scrape_webpage - Extract formatted (markdown, screenshot etc) content from any webpage
- crawl_webpages - Navigate through multiple linked pages and extract LLM-friendly formatted content
- extract_structured_data - Convert messy HTML into structured JSON
- search_with_bing - Query the web and get results with Bing search
- browser_use_agent - Fast, lightweight browser automation with the Browser Use agent
- openai_computer_use_agent - General-purpose automation using OpenAI’s CUA model
- claude_computer_use_agent - Complex browser tasks using Claude computer use
- create_profile - Creates a new persistent Hyperbrowser profile.
- delete_profile - Deletes an existing persistent Hyperbrowser profile.
- list_profiles - Lists existing persistent Hyperbrowser profiles.
To install Hyperbrowser MCP Server for Claude Desktop automatically via Smithery:
npx -y @smithery/cli install @hyperbrowserai/mcp --client claude
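Once the server is installed, the tools listed above can be invoked from any MCP client. Below is a hedged sketch of calling scrape_webpage with the TypeScript MCP SDK; the "url" argument name is an assumption, so read the tool's inputSchema first rather than trusting it:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function scrapeExample() {
  const client = new Client({ name: "tool-demo", version: "1.0.0" }, { capabilities: {} });
  await client.connect(new StdioClientTransport({
    command: "npx",
    args: ["-y", "hyperbrowser-mcp"],
    env: { HYPERBROWSER_API_KEY: process.env.HYPERBROWSER_API_KEY ?? "" },
  }));

  // Print the declared input schema so the real argument names are known.
  const { tools } = await client.listTools();
  console.log(tools.find((t) => t.name === "scrape_webpage")?.inputSchema);

  // Assumed argument name ("url"); adjust to match the schema printed above.
  const result = await client.callTool({
    name: "scrape_webpage",
    arguments: { url: "https://example.com" },
  });
  console.log(result);

  await client.close();
}

scrapeExample().catch(console.error);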
The server exposes documentation about Hyperbrowser through its resources methods. Any client that can perform discovery over resources has access to it.
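A minimal sketch of that discovery flow, again assuming the TypeScript MCP SDK (the resource URIs are whatever the server publishes; none are assumed here):

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function readDocs() {
  const client = new Client({ name: "docs-reader", version: "1.0.0" }, { capabilities: {} });
  await client.connect(new StdioClientTransport({
    command: "npx",
    args: ["-y", "hyperbrowser-mcp"],
    env: { HYPERBROWSER_API_KEY: process.env.HYPERBROWSER_API_KEY ?? "" },
  }));

  // List the documentation resources the server advertises, then read the first one.
  const { resources } = await client.listResources();
  console.log(resources.map((r) => r.uri));

  if (resources.length > 0) {
    const doc = await client.readResource({ uri: resources[0].uri });
    console.log(doc.contents[0]);
  }

  await client.close();
}

readDocs().catch(console.error);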
This project is licensed under the MIT License.
{ "mcpServers": { "hyperbrowser": { "command": "npx", "args": [ "-y", "hyperbrowser-mcp" ], "env": { "API_KEY": "<YOUR_API_KEY>" } } } }
Explore related MCPs that share similar capabilities and solve comparable challenges:
- by modelcontextprotocol: A Model Context Protocol server that provides web content fetching capabilities.
- by microsoft: Provides fast, lightweight browser automation using Playwright's accessibility tree, enabling LLMs to interact with web pages through structured snapshots instead of screenshots.
- by firecrawl: Provides powerful web scraping capabilities for LLM clients such as Cursor, Claude, and others, enabling content extraction, crawling, search, and batch processing through a Model Context Protocol server.
- by zcaceres: Fetches web content and returns it in HTML, JSON, plain‑text, or Markdown formats, optionally applying custom request headers.
- by tinyfish-io: Extract structured data from any web page by invoking AgentQL's extraction tool through a Model Context Protocol server, enabling AI assistants to retrieve and format web information on demand.
- by cyberchitta: Fetches HTML or markdown from bot‑protected websites, delivering plain‑text content that AI assistants can process.
- by xxxbrian: Provides realistic browser-like HTTP request capabilities with accurate TLS/JA3/JA4 fingerprints, enabling LLMs to bypass anti-bot measures and retrieve web content, plus conversion of PDF and HTML to Markdown for easier LLM processing.
- by djannot: Scrape webpages and convert them to well‑formatted markdown, automatically handling cookies, captchas, paywalls and other interactive elements using AI‑driven vision analysis.
- by Dumpling-AI: Provides comprehensive MCP services that integrate Dumpling AI’s data APIs, web scraping, document conversion, AI agents, knowledge‑base management, and secure JavaScript/Python execution.