by ubie-oss
Search documents using Vertex AI with Gemini grounding, allowing private data to improve the relevance of results.
A server that leverages Vertex AI Gemini with grounding to search across one or multiple Vertex AI Datastores, returning results that are contextually anchored in your private data.
git clone https://github.com/ubie-oss/mcp-vertexai-search.git
cd mcp-vertexai-search
uv venv
uv sync --all-extras
uv run mcp-vertexai-search
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git
mcp-vertexai-search --help
Copy config.yml.template to config.yml and adjust it for your environment.
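For example, from the repository root:
# Create your own config from the bundled template
cp config.yml.template config.yml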
uv run mcp-vertexai-search serve \
--config config.yml \
--transport <stdio|sse>
uv run mcp-vertexai-search search \
--config config.yml \
--query "your query"
Q: Do I need a Google Cloud account? A: Yes, you need access to Vertex AI models and Datastores within a GCP project.
Q: Is the package available on PyPI? A: Not yet; install directly from the GitHub repository.
Q: Which model should I use? A: Any Gemini model that supports the generate-content API; specify the model name in config.yml.
Q: Can I run this in production? A: Yes, using the provided Dockerfile or by deploying the server with your preferred orchestration tool.
Q: How do I add multiple Datastores? A: List each datastore under the data_stores section in config.yml with its project ID, location, ID, tool name, and description, as in the sketch below.
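As a hypothetical sketch with two stores (all IDs, tool names, and descriptions below are placeholders; check config.yml.template for the authoritative field layout):
data_stores:
  - project_id: my-gcp-project      # GCP project hosting the datastore
    location: us                    # datastore location
    datastore_id: hr-policies       # placeholder datastore ID
    tool_name: search_hr_policies   # tool name exposed to MCP clients
    description: Searches HR policy documents
  - project_id: my-gcp-project
    location: us
    datastore_id: eng-wiki
    tool_name: search_eng_wiki
    description: Searches the engineering wiki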
This is an MCP server to search documents using Vertex AI.
This solution uses Gemini with Vertex AI grounding to search documents using your private data. Grounding improves the quality of search results by anchoring Gemini's responses in data stored in Vertex AI Datastore. One or more Vertex AI data stores can be integrated with the MCP server. For more details on grounding, refer to the Vertex AI Grounding Documentation.
There are two ways to use this MCP server. If you want to run it on Docker, the first approach is a good fit, since a Dockerfile is provided in the project; see the Docker sketch after the install steps below.
# Clone the repository
git clone git@github.com:ubie-oss/mcp-vertexai-search.git
# Create a virtual environment
uv venv
# Install the dependencies
uv sync --all-extras
# Check the command
uv run mcp-vertexai-search
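If you would rather run the server in Docker, a minimal sketch follows; the image tag, in-container config path, and credential mount are illustrative assumptions, not the project's documented invocation:
# Build the image using the provided Dockerfile (tag is arbitrary)
docker build -t mcp-vertexai-search .
# Run the server over SSE with the config mounted into the container;
# paths and credential handling here are assumptions to adapt to your setup
docker run --rm -it \
-v "$(pwd)/config.yml:/app/config.yml" \
-v "$HOME/.config/gcloud:/root/.config/gcloud:ro" \
mcp-vertexai-search serve --config /app/config.yml --transport sse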
The package isn't published to PyPI yet, but it can be installed from the repository. A config file derived from config.yml.template is needed to run the MCP server, because the Python package doesn't include the config template. Please refer to Appendix A: Config file for the details of the config file.
# Install the package
pip install git+https://github.com/ubie-oss/mcp-vertexai-search.git
# Check the command
mcp-vertexai-search --help
# Optional: Install uv
python -m pip install -r requirements.setup.txt
# Create a virtual environment
uv venv
uv sync --all-extras
The server supports two transports: SSE (Server-Sent Events) and stdio (standard input/output). The transport is selected with the --transport flag.
We can configure the MCP server with a YAML file. config.yml.template is a template for the config file. Please modify the config file to fit your needs.
uv run mcp-vertexai-search serve \
--config config.yml \
--transport <stdio|sse>
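For example, to run over stdio, which most MCP clients spawn directly:
uv run mcp-vertexai-search serve \
--config config.yml \
--transport stdio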
We can test Vertex AI Search with the mcp-vertexai-search search command, without running the MCP server.
uv run mcp-vertexai-search search \
--config config.yml \
--query <your-query>
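For example, with a made-up query (replace it with something your data stores can answer):
uv run mcp-vertexai-search search \
--config config.yml \
--query "What is our data retention policy?"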
config.yml.template is a template for the config file.
- server
  - server.name: The name of the MCP server
- model
  - model.model_name: The name of the Vertex AI model
  - model.project_id: The project ID of the Vertex AI model
  - model.location: The location of the model (e.g. us-central1)
  - model.impersonate_service_account: The service account to impersonate
  - model.generate_content_config: The configuration for the generate content API
- data_stores: The list of Vertex AI data stores
  - data_stores.project_id: The project ID of the Vertex AI data store
  - data_stores.location: The location of the Vertex AI data store (e.g. us)
  - data_stores.datastore_id: The ID of the Vertex AI data store
  - data_stores.tool_name: The name of the tool
  - data_stores.description: The description of the Vertex AI data store
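Putting the keys above together, here is a minimal, hypothetical config.yml; every value is a placeholder, and the exact nesting should be checked against config.yml.template:
server:
  name: vertexai-search            # the name the MCP server advertises

model:
  model_name: gemini-1.5-flash     # any Gemini model that supports generate-content
  project_id: my-gcp-project       # placeholder project ID
  location: us-central1            # model location
  impersonate_service_account: sa@my-gcp-project.iam.gserviceaccount.com  # optional
  generate_content_config:         # settings passed to the generate content API
    temperature: 0.0

data_stores:                       # one entry per Vertex AI data store
  - project_id: my-gcp-project
    location: us
    datastore_id: my-datastore-id
    tool_name: search_company_docs
    description: Searches our internal company documents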