by antfu
Provides MCP support for Vite and Nuxt applications, enabling models to better understand app structure and behavior.
Adds Model Context Protocol (MCP) capabilities to Vite and Nuxt projects, allowing AI models to gain deeper insight into the codebase, routes, and runtime environment of your application.
Installation: run `npm i -D nuxt-mcp` for Nuxt, or `npm i -D vite-plugin-mcp` for Vite. Then add `nuxt-mcp` to the `modules` array in `nuxt.config.ts`, or register `vite-plugin-mcp`'s plugin in the `plugins` array of `vite.config.ts`.

Q: Is nuxt-mcp production-ready? A: It is currently marked as experimental; use it with caution in critical environments.
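The Nuxt setup can be sketched as a one-line module registration (a minimal sketch; any other options shown in the package's README would go alongside `modules`):

```typescript
// nuxt.config.ts — register the nuxt-mcp module so the dev server
// exposes an MCP endpoint that models can query for app context
export default defineNuxtConfig({
  modules: ['nuxt-mcp'],
})
```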
Q: Can I use both the Nuxt module and the Vite plugin in the same monorepo? A: Yes, install each package where appropriate; they operate independently.
Q: Which AI models benefit from MCP support? A: Any model that consumes project context—such as code completion, documentation, or analysis tools—can leverage the protocol.
Q: Where can I find more documentation? A: The badge links in the README point to npm, bundlephobia, and JSDocs for deeper reference.
MCP server helping models to understand your Vite/Nuxt app better.
This monorepo contains two packages:
nuxt-mcp - A Nuxt module for adding MCP support to your Nuxt dev server.
vite-plugin-mcp - A Vite plugin for adding MCP support to your Vite app.

> [!IMPORTANT]
> Experimental. Use with caution.
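For the Vite package, the setup is a plugin entry in `vite.config.ts`. A minimal sketch, assuming the plugin is exposed as a named `ViteMcp` export (the exact export name should be checked against the package's README):

```typescript
// vite.config.ts — register the vite-plugin-mcp plugin
import { defineConfig } from 'vite'
// NOTE: `ViteMcp` is an assumed export name for illustration
import { ViteMcp } from 'vite-plugin-mcp'

export default defineConfig({
  plugins: [ViteMcp()],
})
```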
MIT License © Anthony Fu
Explore related MCPs that share similar capabilities and solve comparable challenges.
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.