by suhail-ak-s
Enables AI models to access Typesense search capabilities through the Model Context Protocol, allowing discovery, querying, and analysis of collection data.
Provides a Model Context Protocol (MCP) server that exposes Typesense collections as resources and tools, allowing large language models (LLMs) to list collections, retrieve documents, run searches, and obtain collection statistics.
A Model Context Protocol (MCP) server implementation that provides AI models with access to Typesense search capabilities. This server enables LLMs to discover, search, and analyze data stored in Typesense collections.

Install the server globally with npm install -g typesense-mcp-server or locally with npm install typesense-mcp-server, or run it directly through an npx command that points to the package and supplies the connection arguments (--host, --port, --protocol, --api-key). Register it in the mcpServers section of the client's configuration file, referencing the command and arguments, then call its tools (typesense_query, typesense_get_document, typesense_collection_stats) and prompts (analyze_collection, search_suggestions) from within the LLM environment. Collections are exposed as resources via typesense:// URIs with JSON schema metadata. For development, install dependencies with npm install, build with npm run build, and start the server with node dist/index.js …; the MCP Inspector (npm run inspector) exposes a web UI for tracing stdio communication, and the server writes a log to /tmp/typesense-mcp.log for troubleshooting.

Tools (an example client-side call is sketched after this list):
typesense_query – full-text search with filtering, sorting, and limiting.
typesense_get_document – fetch a document by its ID.
typesense_collection_stats – retrieve collection statistics and schema.

Prompts:
analyze_collection – insight into a collection's schema and data distribution.
search_suggestions – recommendations for constructing effective queries against a given dataset.
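As a rough illustration of how an MCP client can drive these tools programmatically, the sketch below uses the TypeScript MCP SDK to spawn the server over stdio and call typesense_query. The tool argument names (collection, query, filter_by, sort_by, limit) are assumptions modelled on Typesense's own search parameters, not confirmed names from this server's schema; inspect the schema returned by listTools() before relying on them.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, mirroring the npx invocation used in the
// client configurations shown later on this page.
const transport = new StdioClientTransport({
  command: "npx",
  args: [
    "-y", "typesense-mcp-server",
    "--host", "your-typesense-host",
    "--port", "8108",
    "--protocol", "http",
    "--api-key", "your-api-key",
  ],
});

const client = new Client({ name: "typesense-demo-client", version: "0.1.0" });
await client.connect(transport);

// Discover the tools and their input schemas before calling them.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name)); // e.g. typesense_query, typesense_get_document, ...

// Call typesense_query. NOTE: the argument names below are assumptions based
// on Typesense search parameters; check the schema reported by listTools().
const result = await client.callTool({
  name: "typesense_query",
  arguments: {
    collection: "products",
    query: "wireless headphones",
    filter_by: "price:<100",
    sort_by: "price:asc",
    limit: 5,
  },
});
console.log(JSON.stringify(result.content, null, 2));

await client.close();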
# Global installation
npm install -g typesense-mcp-server
# Local installation
npm install typesense-mcp-server
npx @michaellatman/mcp-get@latest install typesense-mcp-server
Install dependencies:
npm install
Build the server:
npm run build
For development with auto-rebuild:
npm run watch
To use with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "typesense": {
      "command": "node",
      "args": [
        "~/typesense-mcp-server/dist/index.js",
        "--host", "your-typesense-host",
        "--port", "8108",
        "--protocol", "http",
        "--api-key", "your-api-key"
      ]
    }
  }
}
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
The server provides information about Typesense collections:
Collections (typesense://collections/<collection>)
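A connected MCP client can enumerate and read these resources directly. The sketch below assumes the already-connected client object from the earlier example; the "products" collection name is a placeholder, and the URI simply follows the typesense://collections/<collection> pattern shown above.

// List the collection resources the server advertises.
const { resources } = await client.listResources();
for (const resource of resources) {
  console.log(resource.uri, "-", resource.name);
}

// Read one collection resource; "products" is a placeholder collection name.
const contents = await client.readResource({
  uri: "typesense://collections/products",
});
console.log(contents.contents[0]); // JSON schema metadata for the collection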
The server provides prompt templates for:
analyze_collection – analyze a collection's schema and data distribution.
search_suggestions – get recommendations for constructing effective search queries.
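A connected client can fetch these templates in the same way as tools and resources. The sketch below is an assumption about the argument shape (a single collection name passed as a string); check what listPrompts() reports for this server before using it.

// Discover available prompts and their declared arguments.
const { prompts } = await client.listPrompts();
console.log(prompts.map((p) => p.name)); // e.g. analyze_collection, search_suggestions

// Fetch the analyze_collection prompt. The "collection" argument name is an
// assumption; use whatever listPrompts() reports for this server.
const prompt = await client.getPrompt({
  name: "analyze_collection",
  arguments: { collection: "products" },
});
console.log(prompt.messages);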
To use this server with the Claude Desktop app, add the following configuration to the "mcpServers" section of your claude_desktop_config.json:
{
  "mcpServers": {
    "typesense": {
      "command": "npx",
      "args": [
        "-y",
        "typesense-mcp-server",
        "--host", "your-typesense-host",
        "--port", "8108",
        "--protocol", "http",
        "--api-key", "your-api-key"
      ]
    }
  }
}
The server logs information to a file located at:
/tmp/typesense-mcp.log
This log contains detailed information about server operations, requests, and any errors that occur.
This MCP server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.
{
  "mcpServers": {
    "typesense": {
      "command": "npx",
      "args": [
        "-y",
        "typesense-mcp-server",
        "--host",
        "<HOST>",
        "--port",
        "<PORT>",
        "--protocol",
        "<PROTOCOL>"
      ],
      "env": {
        "API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}

claude mcp add typesense npx -y typesense-mcp-server --host <HOST> --port <PORT> --protocol <PROTOCOL>

Explore related MCPs that share similar capabilities and solve comparable challenges
by exa-labs
Provides real-time web search capabilities to AI assistants via a Model Context Protocol server, enabling safe and controlled access to the Exa AI Search API.
by perplexityai
Enables Claude and other MCP‑compatible applications to perform real‑time web searches through the Perplexity (Sonar) API without leaving the MCP ecosystem.
by MicrosoftDocs
Provides semantic search and fetch capabilities for Microsoft official documentation, returning content in markdown format via a lightweight streamable HTTP transport for AI agents and development tools.
by elastic
Enables natural‑language interaction with Elasticsearch indices via the Model Context Protocol, exposing tools for listing indices, fetching mappings, performing searches, running ES|QL queries, and retrieving shard information.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, extraction, retrieval, and RAG capabilities across a wide range of data sources and connectors.
by mamertofabian
Fast cross‑platform file searching leveraging the Everything SDK on Windows, Spotlight on macOS, and locate/plocate on Linux.
by cr7258
Provides Elasticsearch and OpenSearch interaction via Model Context Protocol, enabling document search, index management, cluster monitoring, and alias operations.
by kagisearch
Provides web search and video summarization capabilities through the Model Context Protocol, enabling AI assistants like Claude to perform queries and summarizations.
by liuyoshio
Provides natural‑language search and recommendation for Model Context Protocol servers, delivering rich metadata and real‑time updates.