# Ask Gemini MCP
An MCP server for AI-to-AI collaboration via the Gemini CLI. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's massive 1M+ token context window for large file and codebase analysis while your primary AI handles interaction and code editing.
## Why?
- Get a second opinion — Ask Gemini to review your coding approach before committing to it
- Debate plans — Send architecture proposals to Gemini for critique and alternative suggestions
- Review changes — Have Gemini analyze diffs or modified files to catch issues your primary AI might miss
- Massive context — Gemini reads entire codebases (1M+ tokens) that would overflow other models
## Quick Start

### Claude Code

```shell
claude mcp add gemini-cli -- npx -y ask-gemini-mcp
```
### Claude Desktop

Add to your config file (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS):

```json
{
  "mcpServers": {
    "gemini-cli": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    }
  }
}
```
Other config file locations:

- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- Linux: `~/.config/claude/claude_desktop_config.json`
### Any MCP Client (STDIO Transport)

```json
{
  "transport": {
    "type": "stdio",
    "command": "npx",
    "args": ["-y", "ask-gemini-mcp"]
  }
}
```
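Over STDIO, the client and server then exchange JSON-RPC messages on stdin/stdout. As an illustration of the general MCP handshake (not specific to this server; the protocol version shown is an assumption), a client's first request looks roughly like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```

Most MCP clients handle this handshake for you; you only need the transport config above.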
## Prerequisites
- Node.js v20.0.0 or higher (LTS)
- Google Gemini CLI installed and authenticated
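You can confirm both prerequisites from a terminal before adding the server. This is a quick sketch; the Gemini CLI binary name is assumed to be `gemini`, so adjust if yours differs:

```shell
# Check that the installed Node.js major version is at least 20.
node_major=$(node --version | sed 's/^v\([0-9]*\).*/\1/')
echo "Node.js major version: ${node_major}"

# Check that the Gemini CLI is on PATH (binary name assumed).
command -v gemini >/dev/null 2>&1 \
  && echo "Gemini CLI found" \
  || echo "Gemini CLI not found - install and authenticate it first"
```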
## Tools

| Tool | Purpose |
|---|---|
| `ask-gemini` | Send prompts to the Gemini CLI. Supports `@` file syntax, model selection, sandbox mode, and `changeMode` for structured edits |
| `fetch-chunk` | Retrieve subsequent chunks from cached large responses |
| `ping` | Connection test — verify MCP setup without using Gemini tokens |
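Under the hood, an MCP client invokes these tools with JSON-RPC `tools/call` requests. A sketch of calling `ping` (the exact argument schema is an assumption; `ping` takes no required arguments here):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "ping",
    "arguments": {}
  }
}
```

In practice you never write these by hand — your MCP client translates natural-language requests like the examples below into tool calls.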
## Usage Examples

File analysis (`@` syntax):

```
ask gemini to analyze @src/main.js and explain what it does
use gemini to summarize @. (the current directory)
```

Code review:

```
ask gemini to review the changes in @src/auth.ts for security issues
use gemini to compare @old.js and @new.js
```

General questions:

```
ask gemini about best practices for React state management
```

Sandbox mode:

```
use gemini sandbox to create and run a Python script
```
## Models

| Model | Use Case |
|---|---|
| `gemini-3.1-pro-preview` | Default — best quality reasoning |
| `gemini-3-flash-preview` | Faster responses, large codebases |
The server automatically falls back to Flash when Pro quota is exceeded.
## Contributing
Contributions are welcome! See open issues for things to work on.
## License
MIT License. See LICENSE for details.
Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed by, or sponsored by Google.