by builtwith
Enables AI assistants to retrieve detailed technology stack information for any website through the BuiltWith API.
Provides a Model Context Protocol (MCP) server that bridges AI assistants with BuiltWith’s technology detection API, allowing natural‑language queries about a site’s frameworks, analytics tools, hosting providers, and more.
git clone https://github.com/builtwith/mcp.git
cd mcp
npm install
Set BUILTWITH_API_KEY in the MCP configuration JSON (example paths for Claude Desktop and VS Code are documented).
Q: Do I need a BuiltWith account? A: Yes, an API key from BuiltWith is required; obtain it from the BuiltWith developer portal.
Q: Which programming languages can I use with this server? A: The server itself runs on Node.js, but any MCP‑compatible AI tool (regardless of language) can communicate with it.
Q: How is the API key secured? A: It is injected via an environment variable (BUILTWITH_API_KEY) and not hard-coded in the source.
Q: Can I run the server locally? A: Absolutely. After npm install, you can start it on your machine and point your AI assistant to the local endpoint.
Q: Is there a limit on the number of domain queries? A: Limits are governed by your BuiltWith subscription plan, not by this MCP server.
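The key-handling answer above can be sketched as a small startup guard. This is a hypothetical helper, not code from this repository; it assumes only that the server reads BUILTWITH_API_KEY from its environment, as the FAQ describes:

```javascript
// Sketch of the key handling described above: the key is read from the
// environment (as injected by the MCP configuration's "env" block) rather
// than hard-coded. getApiKey is a hypothetical helper, not from this repo.
function getApiKey(env = process.env) {
  const key = env.BUILTWITH_API_KEY;
  if (!key) {
    throw new Error(
      "BUILTWITH_API_KEY is not set; add it to the env block of your MCP configuration."
    );
  }
  return key;
}
```

Failing fast at startup like this surfaces a missing key immediately instead of producing opaque API errors on the first lookup.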
A Model Context Protocol (MCP) server that integrates with BuiltWith's technology detection API. This server allows AI assistants to identify the technology stack behind any website, providing detailed information about frameworks, analytics tools, hosting services, and more, all through natural language commands.
# Clone the repository
git clone https://github.com/builtwith/mcp.git
# Navigate to directory
cd mcp
# Install dependencies
npm install
The BuiltWith MCP Server requires an API key from BuiltWith. Configure the server with your API key as follows:
{
  "mcpServers": {
    "builtwith": {
      "command": "node",
      "args": ["[PATH-TO]/bw-mcp-v1.js"],
      "env": {
        "BUILTWITH_API_KEY": "[YOUR-API-KEY]"
      }
    }
  }
}
Claude Desktop: ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows)
Cline (VS Code): ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json (macOS) or %APPDATA%\Code\User\globalStorage\saoudrizwan.claude-dev\settings\cline_mcp_settings.json (Windows)
Once configured, you can use the BuiltWith MCP Server with any MCP-compatible AI assistant.
The BuiltWith MCP Server acts as a bridge between AI assistants and the BuiltWith API: it receives tool calls from the assistant, queries BuiltWith for the requested domain, and returns the detected technologies.
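At its core, that bridging step amounts to turning a domain name and key into a BuiltWith API request. A minimal sketch follows; the endpoint path (free1/api.json) and the KEY/LOOKUP parameter names are assumptions based on BuiltWith's public Free API, not taken from this repository:

```javascript
// Build a BuiltWith lookup URL for a domain.
// Endpoint and parameter names (KEY, LOOKUP) are assumptions about
// BuiltWith's public Free API; this server may use a different endpoint.
function buildLookupUrl(domain, apiKey) {
  const url = new URL("https://api.builtwith.com/free1/api.json");
  url.searchParams.set("KEY", apiKey);
  url.searchParams.set("LOOKUP", domain);
  return url.toString();
}

// The server would then fetch this URL and hand the JSON technology
// groups back to the AI assistant, roughly:
//   const res = await fetch(buildLookupUrl("example.com", process.env.BUILTWITH_API_KEY));
//   const tech = await res.json();
```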
For more information about the BuiltWith API, see the BuiltWith developer portal.
This project is licensed under the MIT License - see the LICENSE file for details.
Explore related MCPs that share similar capabilities and solve comparable challenges.
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.