by vectara
Provides fast, reliable retrieval‑augmented generation and semantic search via the Model Context Protocol, allowing agents to query Vectara corpora and receive generated answers together with source passages.
Vectara MCP exposes two primary tools – ask_vectara and search_vectara – that let AI agents perform RAG queries or plain semantic searches against Vectara indexes. The server implements the Model Context Protocol (MCP), so any MCP‑compatible client (e.g., Claude Desktop) can call these tools without additional glue code.
pip install vectara-mcp
uv tool run vectara-mcp
claude_desktop_config.json:
{
  "mcpServers": {
    "Vectara": {
      "command": "uv",
      "args": ["tool", "run", "vectara-mcp"]
    }
  }
}
Q: Do I need a Vectara account?
A: Yes, you must have a Vectara API key and at least one corpus key to use the tools.
Q: Can I run multiple corpora simultaneously?
A: Provide a list of corpus keys in the corpus_keys argument; the server will search across all supplied corpora.
Q: What language models are used for generation?
A: The default generation preset (vectara-summary-table-md-query-ext-jan-2025-gpt-4o) leverages a Vectara-hosted LLM; you can specify a different preset via the generation_preset_name argument.
Q: How do I change the number of context sentences?
A: Use the optional n_sentences_before and n_sentences_after parameters when calling the tool.
Q: Is there a limit on the number of returned search results?
A: The max_used_search_results argument caps the number of results used for generation (default 10); see the example call after this list for how these options fit together.
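To make these options concrete, here is a minimal sketch (in Python) of the arguments an MCP client might pass to ask_vectara. The corpus_keys, n_sentences_before, n_sentences_after, max_used_search_results, and generation_preset_name names come from the FAQ above; the query and api_key field names are assumptions for illustration, since the full tool signature is not listed on this page.

# Hypothetical ask_vectara argument payload (a sketch, not the definitive schema).
# "query" and "api_key" are assumed field names; the remaining keys are the
# parameters described in the FAQ above, shown with example values.
ask_vectara_args = {
    "query": "Who is Amr Awadallah?",
    "corpus_keys": ["my-docs", "my-website"],  # one or more Vectara corpus keys
    "api_key": "<YOUR_VECTARA_API_KEY>",
    "n_sentences_before": 2,        # context sentences before each matching passage
    "n_sentences_after": 2,         # context sentences after each matching passage
    "max_used_search_results": 10,  # cap on results used for generation (default 10)
    "generation_preset_name": "vectara-summary-table-md-query-ext-jan-2025-gpt-4o",
}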
🔌 Compatible with Claude Desktop and any other MCP client!
Vectara MCP is also compatible with any MCP client.
The Model Context Protocol (MCP) is an open standard that enables AI systems to interact seamlessly with various data sources and tools, facilitating secure, two-way connections.
Vectara-MCP provides any agentic application with access to fast, reliable RAG with reduced hallucination, powered by Vectara's Trusted RAG platform, via MCP.
You can install the package directly from PyPI:
pip install vectara-mcp
ask_vectara: Run a RAG query using Vectara, returning search results with a generated response.
Args: the user query and corpus_keys (plus your Vectara API key) are required; optional parameters include n_sentences_before, n_sentences_after, max_used_search_results, and generation_preset_name (see the FAQ above).
Returns: the generated answer together with the supporting source passages.
search_vectara: Run a semantic search query using Vectara, without generation.
Args: the user query and corpus_keys (plus your Vectara API key) are required; the optional context and result-count parameters above also apply.
Returns: the matching search results, with no generated answer.
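The following is a short sketch of how any MCP-compatible client could call these tools programmatically, using the official MCP Python SDK's stdio client and launching the server the same way the Claude Desktop config below does. The "query" and "api_key" argument names are assumptions for illustration; only corpus_keys is confirmed by the parameter descriptions above.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the Vectara MCP server over stdio, mirroring the Claude Desktop config.
server_params = StdioServerParameters(command="uv", args=["tool", "run", "vectara-mcp"])

async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools the server exposes (ask_vectara, search_vectara).
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

            # Call ask_vectara; "query" and "api_key" are assumed field names.
            result = await session.call_tool(
                "ask_vectara",
                arguments={
                    "query": "Who is Amr Awadallah?",
                    "corpus_keys": ["my-corpus"],
                    "api_key": "<YOUR_VECTARA_API_KEY>",
                },
            )
            print(result.content)

asyncio.run(main())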
Add to your claude_desktop_config.json:
{
  "mcpServers": {
    "Vectara": {
      "command": "uv",
      "args": ["tool", "run", "vectara-mcp"]
    }
  }
}
Once the installation is complete and the Claude Desktop app is configured, you must completely close and re-open the Claude Desktop app to see the Vectara-mcp server. You should see a hammer icon in the bottom left of the app, indicating available MCP tools; click it to see more detail on the ask-vectara and search-vectara tools.
Claude will now have full access to the Vectara-mcp server, including the ask-vectara and search-vectara tools. The first time you invoke the tools, Claude will ask you for your Vectara API key and corpus key (or keys, if you want to use multiple corpora). After you set those, you are ready to go. Here are some examples you can try (with the Vectara corpus that includes information from our website):
ask-vectara Who is Amr Awadallah?
search-vectara events in NYC?
{ "mcpServers": { "Vectara": { "command": "uv", "args": [ "tool", "run", "vectara-mcp" ] } } }
Explore related MCPs that share similar capabilities and solve comparable challenges
by exa-labs
Provides real-time web search capabilities to AI assistants via a Model Context Protocol server, enabling safe and controlled access to the Exa AI Search API.
by elastic
Enables natural‑language interaction with Elasticsearch indices via the Model Context Protocol, exposing tools for listing indices, fetching mappings, performing searches, running ES|QL queries, and retrieving shard information.
by graphlit
Enables integration between MCP clients and the Graphlit platform, providing ingestion, extraction, retrieval, and RAG capabilities across a wide range of data sources and connectors.
by mamertofabian
Fast cross‑platform file searching leveraging the Everything SDK on Windows, Spotlight on macOS, and locate/plocate on Linux.
by cr7258
Provides Elasticsearch and OpenSearch interaction via Model Context Protocol, enabling document search, index management, cluster monitoring, and alias operations.
by liuyoshio
Provides natural‑language search and recommendation for Model Context Protocol servers, delivering rich metadata and real‑time updates.
by ihor-sokoliuk
Provides web search capabilities via the SearXNG API, exposing them through an MCP server for seamless integration with AI agents and tools.
by fatwang2
Provides web and news search, URL crawling, sitemap extraction, deep‑reasoning, and trending topic retrieval via Search1API, exposed as an MCP server for integration with AI clients.
by cnych
Provides SEO data retrieval via Ahrefs, exposing MCP tools for backlink analysis, keyword generation, traffic estimation, and keyword difficulty, with automated CAPTCHA solving and response caching.