by Nekzus
Provides AI‑enhanced analysis of NPM packages, delivering version history, dependency mapping, security vulnerability scanning, size metrics, and quality scores through the Model Context Protocol.
Npm Sentinel delivers real‑time, AI‑driven intelligence for NPM packages. It exposes a set of MCP tools that can fetch versions, dependencies, TypeScript support, package size, security advisories, download trends, quality scores and many other metrics, allowing developers to make faster, safer package‑management decisions.
{
  "mcpServers": {
    "npm-sentinel": {
      "command": "npx",
      "args": ["-y", "@nekzus/mcp-server@latest"]
    }
  }
}
Add the above snippet to your VS Code settings.json under mcpServers, or to a Claude Desktop config file.
Docker: build the image (docker build -t nekzus/npm-sentinel-mcp .) and run it with the provided mount arguments.
Smithery.ai: reference https://smithery.ai/server/@Nekzus/npm-sentinel-mcp in the MCP configuration.
Local development: run npm run dev for a playground, or npm run build:http followed by npm run start:http for the HTTP server.
Q: Do I need to modify my existing npm workflow?
A: No. The server works as an MCP side‑car; your normal npm install commands remain unchanged.
Q: Can I run the server locally without Docker?
A: Yes. Use npx @nekzus/mcp-server@latest (STDIO) or start the HTTP variant with npm run start:http.
Q: Is an API key required?
A: The core functionality does not require an API key, but some optional AI back‑ends (Claude, Anthropic) may need credentials set in the environment.
Q: How does caching work?
A: Frequently requested package metadata is cached in memory, reducing external registry calls and respecting rate limits (see the sketch after this FAQ).
Q: What languages are supported?
A: The server is written in TypeScript and can be invoked from any MCP‑compatible client across languages.
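The server's cache internals aren't documented here, but as a rough sketch of the in-memory, TTL-based approach described in the caching answer above, a TypeScript cache could look like the following (all names, the 5-minute TTL, and direct registry fetching are illustrative assumptions, not the server's actual code):

// Minimal sketch of an in-memory TTL cache for NPM registry metadata.
// CacheEntry, MetadataCache, and fetchPackument are illustrative names.
interface CacheEntry<T> {
  value: T;
  expiresAt: number; // epoch millis after which the entry is considered stale
}

class MetadataCache<T> {
  private entries = new Map<string, CacheEntry<T>>();
  constructor(private ttlMs: number = 5 * 60_000) {}

  async getOrFetch(key: string, fetcher: () => Promise<T>): Promise<T> {
    const hit = this.entries.get(key);
    if (hit && hit.expiresAt > Date.now()) return hit.value; // cache hit: no registry call
    const value = await fetcher(); // cache miss: query the registry once
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    return value;
  }
}

// Usage: fetch a packument once, then serve repeat requests from memory.
const cache = new MetadataCache<unknown>();
async function fetchPackument(name: string): Promise<unknown> {
  const res = await fetch(`https://registry.npmjs.org/${encodeURIComponent(name)}`);
  return res.json();
}
const reactMeta = await cache.getOrFetch("react", () => fetchPackument("react"));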
A powerful Model Context Protocol (MCP) server that revolutionizes NPM package analysis through AI. Built to integrate with Claude and Anthropic AI, it provides real-time intelligence on package security, dependencies, and performance. This MCP server delivers instant insights and smart analysis to safeguard and optimize your npm ecosystem, making package management decisions faster and safer for modern development workflows.
Note: The server provides AI-assisted analysis through MCP integration.
This MCP server now supports both STDIO and HTTP streamable transport. Your existing STDIO configuration will continue to work without changes.
New capabilities:
HTTP streamable transport (alongside the existing STDIO transport)
Direct deployment on Smithery.ai
Development commands:
# Development server with playground
npm run dev
# Build for HTTP
npm run build:http
# Start HTTP server
npm run start:http
Add this to your VS Code MCP config file. See VS Code MCP docs for more info.
{
  "servers": {
    "npm-sentinel": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@nekzus/mcp-server@latest"]
    }
  }
}
This MCP server now supports HTTP streamable transport through Smithery.ai for enhanced scalability and performance, and it can be deployed directly on Smithery.ai.
Configuration for Smithery.ai:
{
  "mcpServers": {
    "npm-sentinel": {
      "type": "http",
      "url": "https://smithery.ai/server/@Nekzus/npm-sentinel-mcp"
    }
  }
}
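A client can reach that HTTP deployment with the MCP TypeScript SDK's streamable-HTTP transport. The sketch below is illustrative: the endpoint URL is copied from the config above, and "someAnalysisTool" is a placeholder name, since the actual tool names should be discovered via listTools().

// Sketch: connecting an MCP client to the HTTP deployment.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const transport = new StreamableHTTPClientTransport(
  new URL("https://smithery.ai/server/@Nekzus/npm-sentinel-mcp"),
);
const client = new Client({ name: "npm-sentinel-demo", version: "1.0.0" });
await client.connect(transport);

// Enumerate the analysis tools the server exposes.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Call one of them; most tools take a packages (string[]) argument.
// "someAnalysisTool" is a placeholder, not an actual tool identifier.
const result = await client.callTool({
  name: "someAnalysisTool",
  arguments: { packages: ["react", "express"] },
});
console.log(result.content);

await client.close();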
# Build the Docker image
docker build -t nekzus/npm-sentinel-mcp .
You can run the MCP server using Docker with directory mounting to /projects:
{
  "mcpServers": {
    "npm-sentinel-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-w", "/projects",
        "--mount", "type=bind,src=${PWD},dst=/projects",
        "nekzus/npm-sentinel-mcp",
        "node",
        "dist/index.js"
      ]
    }
  }
}
For multiple directories:
{
  "mcpServers": {
    "npm-sentinel-mcp": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-w", "/projects",
        "--mount", "type=bind,src=/path/to/workspace,dst=/projects/workspace",
        "--mount", "type=bind,src=/path/to/other/dir,dst=/projects/other/dir,ro",
        "nekzus/npm-sentinel-mcp",
        "node",
        "dist/index.js"
      ]
    }
  }
}
Note: All mounted directories must be under /projects for proper access.
Add this to your claude_desktop_config.json:
{
  "mcpServers": {
    "npmsentinel": {
      "command": "npx",
      "args": ["-y", "@nekzus/mcp-server@latest"]
    }
  }
}
Configuration file locations:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "npm-sentinel-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "@nekzus/mcp-server@latest"
      ]
    }
  }
}
The server exposes its tools via the Model Context Protocol. All tools adhere to a standardized response format:
{
  "content": [
    {
      "type": "text",
      "text": "string",
      "isError": boolean // Optional
    }
    // ... more content items if necessary
  ]
}
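A TypeScript client could model that response shape roughly as follows; the interface names and the renderResponse helper are illustrative, not part of the server's API.

// Illustrative TypeScript model of the standardized tool response above.
interface ToolContentItem {
  type: "text";
  text: string;
  isError?: boolean; // present only when this item reports an error
}

interface ToolResponse {
  content: ToolContentItem[];
}

// Example consumer: surface errors, otherwise join the text items.
function renderResponse(response: ToolResponse): string {
  const errors = response.content.filter((item) => item.isError);
  if (errors.length > 0) {
    throw new Error(errors.map((item) => item.text).join("\n"));
  }
  return response.content.map((item) => item.text).join("\n");
}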
npm://registry: NPM Registry interface
npm://security: Security analysis interface
npm://metrics: Package metrics interface
The server also provides the following informational resources accessible via MCP GetResource requests:
doc://server/readme: README.md file content for this NPM Sentinel MCP server (text/markdown).
doc://mcp/specification: llms-full.txt content, providing the comprehensive Model Context Protocol specification (text/plain).
Tool parameters: the analysis tools generally accept a packages (string[]) argument; other parameters used across tools include period ("last-week" | "last-month" | "last-year"), query (string), and limit (number, optional).
# Install dependencies
npm install
# Build for STDIO (traditional)
npm run build:stdio
# Build for HTTP (Smithery)
npm run build:http
# Development server
npm run dev
This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.
MIT © nekzus
Explore related MCPs that share similar capabilities and solve comparable challenges
by modelcontextprotocol
A Model Context Protocol server for Git repository interaction and automation.
by zed-industries
A high‑performance, multiplayer code editor designed for speed and collaboration.
by modelcontextprotocol
Model Context Protocol Servers
by modelcontextprotocol
A Model Context Protocol server that provides time and timezone conversion capabilities.
by cline
An autonomous coding assistant that can create and edit files, execute terminal commands, and interact with a browser directly from your IDE, operating step‑by‑step with explicit user permission.
by upstash
Provides up-to-date, version‑specific library documentation and code examples directly inside LLM prompts, eliminating outdated information and hallucinated APIs.
by daytonaio
Provides a secure, elastic infrastructure that creates isolated sandboxes for running AI‑generated code with sub‑90 ms startup, unlimited persistence, and OCI/Docker compatibility.
by continuedev
Enables faster shipping of code by integrating continuous AI agents across IDEs, terminals, and CI pipelines, offering chat, edit, autocomplete, and customizable agent workflows.
by github
Connects AI tools directly to GitHub, enabling natural‑language interactions for repository browsing, issue and pull‑request management, CI/CD monitoring, code‑security analysis, and team collaboration.
{
  "mcpServers": {
    "npm-sentinel": {
      "command": "npx",
      "args": [
        "-y",
        "@nekzus/mcp-server@latest"
      ],
      "env": {}
    }
  }
}

claude mcp add npm-sentinel npx -y @nekzus/mcp-server@latest