by LKbaba
Provides an MCP server that connects Claude Code to Google Gemini 3.1, enabling large‑scale code analysis, real‑time web search, multimodal image queries, and creative brainstorming directly within Claude.
Gemini MCP Server acts as a bridge between Claude Code and Google Gemini 3.1. It exposes Gemini’s 1 M‑token context, Google Search grounding, and multimodal vision capabilities as MCP tools, allowing Claude to delegate specialised tasks such as codebase analysis, web‑search‑backed research, image interpretation, and brainstorming.
Config file locations:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`

Authentication: set `GEMINI_API_KEY` in the `env` section, or paste service-account JSON fields into `env` (the server auto-detects this mode).

Tools: `gemini_search`, `gemini_analyze_codebase`, `gemini_analyze_content`, `gemini_multimodal_query`, `gemini_brainstorm`.

Models: `gemini-3.1-pro-preview` (default), `gemini-3-flash-preview` (fast).

Proxy: supported via `HTTPS_PROXY`.

Q: Which authentication method should I use?
A: For quick experiments, set GEMINI_API_KEY. For production or team environments, use a Vertex AI service‑account JSON for better security and quota management.
Q: Do I need to set any additional Google env variables?
A: No. The server auto‑detects service‑account fields from env. Only GEMINI_API_KEY is required for AI‑Studio mode.
Q: How do I change the default model?
A: Include a model parameter in the tool arguments, e.g. { "model": "gemini-3-flash-preview" }.
Q: Can I run the server behind a corporate proxy?
A: Yes. Add HTTPS_PROXY (or HTTP_PROXY) to the env section of the MCP config.
Q: What if both an API key and a service account are provided?
A: Vertex AI mode takes precedence.
Q: Is the server compatible with other Claude plugins?
A: It follows the standard MCP interface, so any Claude client that supports MCP can load it.
Give Claude Code the power of Gemini 3.1
An MCP server that connects Claude Code to Google's Gemini 3.1, unlocking capabilities that complement Claude's strengths.
| Gemini's Strengths | Use Case |
|---|---|
| 1M Token Context | Analyze entire codebases in one shot |
| Google Search Grounding | Get real-time documentation & latest info |
| Multimodal Vision | Understand screenshots, diagrams, designs |
Philosophy: Claude is the commander, Gemini is the specialist.
Add to your MCP config file:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`

Then restart Claude Code.
Two authentication modes are supported. The server auto-detects which mode to use based on environment variables.
Best for personal development and quick trials.
```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "GEMINI_API_KEY": "your-api-key"
      }
    }
  }
}
```
More secure, uses Google Cloud IAM authentication.
Prerequisites:
Setup (2 minutes):
Paste the service account JSON fields into the `env` section of your MCP config:

```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "type": "service_account",
        "project_id": "your-project-id",
        "private_key_id": "key-id-here",
        "private_key": "-----BEGIN PRIVATE KEY-----\nMIIEv...\n-----END PRIVATE KEY-----\n",
        "client_email": "your-sa@your-project.iam.gserviceaccount.com",
        "client_id": "123456789",
        "auth_uri": "https://accounts.google.com/o/oauth2/auth",
        "token_uri": "https://oauth2.googleapis.com/token",
        "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
        "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/your-sa%40your-project.iam.gserviceaccount.com",
        "universe_domain": "googleapis.com"
      }
    }
  }
}
```
The server auto-detects service account credentials from env vars; no `GOOGLE_GENAI_USE_VERTEXAI` or `GOOGLE_CLOUD_PROJECT` needed. Just paste and go.
Tip: On Windows, the server automatically fixes slash corruption (`/` → `\`) in PEM private keys that some MCP clients introduce.
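The tip above could be implemented roughly as follows. This is a hypothetical sketch, not the server's actual code: `fixPemSlashes` is an invented name, and a production implementation would need to avoid rewriting `/n` sequences that legitimately occur inside the base64 key body.

```typescript
// Hypothetical sketch: some Windows MCP clients turn the "\n" escapes in a
// pasted PEM key into "/n". This naive pass restores them. NOTE: base64 can
// legitimately contain "/n", so a real fix would be more careful (e.g. only
// rewriting around the BEGIN/END markers).
function fixPemSlashes(key: string): string {
  return key.replace(/\/n/g, "\n");
}

const corrupted =
  "-----BEGIN PRIVATE KEY-----/nMIIEv.../n-----END PRIVATE KEY-----/n";
console.log(fixPemSlashes(corrupted).includes("\n")); // true
```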
Advanced options: You can also use `GOOGLE_GENAI_USE_VERTEXAI=true` + `GOOGLE_CREDENTIALS_JSON`, `GOOGLE_APPLICATION_CREDENTIALS` (file path), or `gcloud auth application-default login`. See the environment variables reference below.
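For the explicit file-path variant, the config might look like this (a sketch assuming a service-account key has already been downloaded; the project ID and path are placeholders):

```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "GOOGLE_GENAI_USE_VERTEXAI": "true",
        "GOOGLE_CLOUD_PROJECT": "your-project-id",
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/service-account.json"
      }
    }
  }
}
```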
Paste JSON approach (Option 2 above — simplest for Vertex AI):
Just paste the service account JSON fields directly into `env`. No extra variables needed; the server auto-detects `"type": "service_account"`.
Explicit Vertex AI mode (advanced):
| Variable | Required | Description |
|---|---|---|
| `GOOGLE_GENAI_USE_VERTEXAI` | Yes | Set to `"true"` to enable |
| `GOOGLE_CLOUD_PROJECT` | Yes | GCP project ID |
| `GOOGLE_CLOUD_LOCATION` | No | Region (default: `global`) |
| `GOOGLE_CREDENTIALS_JSON` | No* | Entire service account JSON as a single string |
| `GOOGLE_APPLICATION_CREDENTIALS` | No* | File path to service account JSON key |

\* At least one credential source is needed: `GOOGLE_CREDENTIALS_JSON`, `GOOGLE_APPLICATION_CREDENTIALS`, or gcloud ADC.
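For the gcloud ADC route, sign in once on the machine running the server (a standard gcloud command); the MCP config then needs only `GOOGLE_GENAI_USE_VERTEXAI` and `GOOGLE_CLOUD_PROJECT`, with no credential variable at all:

```bash
gcloud auth application-default login
```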
AI Studio mode:
| Variable | Required | Description |
|---|---|---|
| `GEMINI_API_KEY` | Yes | API key from Google AI Studio |
If both modes are configured, Vertex AI takes priority.
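The precedence rule can be illustrated with a short sketch. This is a hypothetical illustration of the documented behavior, not the actual `gemini-factory.ts` logic, and `detectAuthMode` is an invented name:

```typescript
// Hypothetical illustration of the documented precedence:
// Vertex AI credentials win over an AI Studio API key.
type AuthMode = "vertex-ai" | "ai-studio" | "none";

function detectAuthMode(env: Record<string, string | undefined>): AuthMode {
  const hasServiceAccount = env["type"] === "service_account";
  const explicitVertex = env["GOOGLE_GENAI_USE_VERTEXAI"] === "true";
  if (hasServiceAccount || explicitVertex) return "vertex-ai";
  if (env["GEMINI_API_KEY"]) return "ai-studio";
  return "none";
}

// If both are configured, Vertex AI mode is chosen:
console.log(detectAuthMode({ type: "service_account", GEMINI_API_KEY: "k" })); // vertex-ai
```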
| Tool | Description |
|---|---|
| `gemini_search` | Web search with Google Search grounding. Get real-time info, latest docs, current events. |
| Tool | Description |
|---|---|
| `gemini_analyze_codebase` | Analyze entire projects with 1M token context. Supports directory path, file paths, or direct content. |
| `gemini_analyze_content` | Analyze code, documents, or data. Supports file path or direct content input. |
| Tool | Description |
|---|---|
| `gemini_multimodal_query` | Analyze images with natural language. Understand designs, diagrams, screenshots. |
| Tool | Description |
|---|---|
| `gemini_brainstorm` | Generate creative ideas with project context. Supports reading README and PRD files. |
All tools support an optional `model` parameter:
| Model | Speed | Best For |
|---|---|---|
| `gemini-3.1-pro-preview` | Standard | Complex analysis, deep reasoning, agentic workflows (default) |
| `gemini-3-flash-preview` | Fast | Simple tasks, quick responses, search queries |
Note: `gemini-3-pro-preview` is deprecated (retired 2026-03-09) and is automatically mapped to `gemini-3.1-pro-preview`.
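The deprecation mapping could be sketched like this (hypothetical; the actual mapping lives in `src/config/models.ts`, and `resolveModel` is an invented name):

```typescript
// Hypothetical sketch of the model aliasing described above. Unknown model
// names pass through unchanged so newer releases keep working.
const DEFAULT_MODEL = "gemini-3.1-pro-preview";

const MODEL_ALIASES: Record<string, string> = {
  // Retired 2026-03-09; transparently upgraded to the new default.
  "gemini-3-pro-preview": "gemini-3.1-pro-preview",
};

function resolveModel(requested?: string): string {
  const model = requested ?? DEFAULT_MODEL; // fall back to the default model
  return MODEL_ALIASES[model] ?? model;
}

console.log(resolveModel("gemini-3-pro-preview")); // gemini-3.1-pro-preview
console.log(resolveModel()); // gemini-3.1-pro-preview
```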
Example: Use the new default model
```json
{
  "name": "gemini_analyze_content",
  "arguments": {
    "filePath": "./src/index.ts",
    "task": "review",
    "model": "gemini-3.1-pro-preview"
  }
}
```
"Use Gemini to analyze the ./src directory for architectural patterns and potential issues"
"Search for the latest Next.js 15 App Router documentation"
"Analyze this architecture diagram and explain the data flow" (attach image)
"Brainstorm feature ideas based on this project's README.md"
Add proxy environment variable to your config:
```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "GEMINI_API_KEY": "your_api_key_here",
        "HTTPS_PROXY": "http://127.0.0.1:7897"
      }
    }
  }
}
```
```bash
git clone https://github.com/LKbaba/Gemini-mcp.git
cd Gemini-mcp
npm install
npm run build
export GEMINI_API_KEY="your_api_key_here"
npm start
```
```
src/
├── config/
│   ├── models.ts           # Model configurations
│   └── constants.ts        # Global constants
├── tools/
│   ├── definitions.ts      # MCP tool definitions
│   ├── multimodal-query.ts # Multimodal queries
│   ├── analyze-content.ts  # Content analysis
│   ├── analyze-codebase.ts # Codebase analysis
│   ├── brainstorm.ts       # Brainstorming
│   └── search.ts           # Web search
├── utils/
│   ├── gemini-factory.ts   # Dual-mode auth factory (API Key + Vertex AI)
│   ├── gemini-client.ts    # Gemini API client
│   ├── file-reader.ts      # File system access
│   ├── security.ts         # Path validation
│   ├── validators.ts       # Parameter validation
│   └── error-handler.ts    # Error handling
├── types.ts                # Type definitions
└── server.ts               # Main server
```
Based on aliargun/mcp-server-gemini
MIT
```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "@lkbaba/mcp-server-gemini"],
      "env": {
        "GEMINI_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
```

Or install via the Claude Code CLI:

```bash
claude mcp add gemini npx -y @lkbaba/mcp-server-gemini
```