Zen MCP Server NPX Wrapper

Easy-to-use NPX wrapper for Zen MCP Server that gives Claude access to multiple AI models (Gemini, OpenAI, OpenRouter, Ollama) for enhanced development capabilities.

Quick Start

npx zen-mcp-server-199bio

That's it! No Docker required. 🎉

What is Zen MCP Server?

Zen MCP Server gives Claude Desktop access to multiple AI models for:

  • 🧠 Extended reasoning with Gemini 2.0 Pro's thinking mode
  • 💬 Collaborative development with multiple AI perspectives
  • 🔍 Code review and architectural analysis
  • 🐛 Advanced debugging with specialized models
  • 📊 Large context analysis (Gemini: 1M tokens, O3: 200K tokens)
  • 🔄 Conversation threading - AI models maintain context across multiple calls

Features

  • No Docker required - Runs directly with Python
  • 🚀 Fast startup - No container overhead
  • 💾 Lightweight - Minimal resource usage
  • 🔧 Auto-setup - Handles Python dependencies automatically
  • 📦 Virtual environment - Isolated dependencies
  • 🌍 Cross-platform - Works on macOS, Windows, Linux

First Time Setup

On first run, the wrapper will:

  1. Check that Python 3.11+ is installed
  2. Clone Zen MCP Server to ~/.zen-mcp-server
  3. Create a .env file and prompt for API keys (see the sketch after this list)
  4. Set up a Python virtual environment
  5. Install dependencies automatically

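A rough idea of what ~/.zen-mcp-server/.env looks like once keys are filled in. The variable names match the ones used in the Claude Desktop config below; the placeholders are not real keys, and only one of them is required:

GEMINI_API_KEY=your_gemini_key_here
OPENAI_API_KEY=your_openai_key_here
OPENROUTER_API_KEY=your_openrouter_key_here
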
Quick Install

1. Get API Keys (at least one required)

Choose one or more:

  • Gemini API key
  • OpenAI API key
  • OpenRouter API key

2. Add to Claude Desktop

Add to your claude_desktop_config.json:

{
  "mcpServers": {
    "zen": {
      "command": "npx",
      "args": ["zen-mcp-server-199bio"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_key_here",
        "OPENAI_API_KEY": "your_openai_key_here",
        "OPENROUTER_API_KEY": "your_openrouter_key_here"
      }
    }
  }
}

That's it! Just restart Claude Desktop and you're ready to go.

Location of config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

Usage with Claude CLI

claude mcp add zen "npx" "zen-mcp-server-199bio"
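
The server still needs at least one API key in its environment. A sketch, assuming a recent Claude CLI that accepts -e KEY=value flags when adding a server; keys saved to ~/.zen-mcp-server/.env during first-time setup should also be picked up without extra flags:

claude mcp add zen -e GEMINI_API_KEY=your_gemini_key_here -- npx zen-mcp-server-199bio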

Available Tools

Once configured, Claude will have access to these tools:

  • zen - Default tool for quick AI consultation (alias for chat)
  • chat - Collaborative development discussions
  • thinkdeep - Extended reasoning (Gemini 2.0 Pro)
  • codereview - Professional code review
  • precommit - Pre-commit validation
  • debug - Advanced debugging assistance
  • analyze - Smart file and codebase analysis

Quick Usage: Just say "use zen" for quick AI consultations!

Troubleshooting

Python not found?
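
The wrapper needs a python3 of version 3.11 or newer on your PATH. A quick check, with example install commands (the brew/apt lines are platform-specific suggestions, not part of the wrapper):

python3 --version                # should report 3.11 or newer
brew install python@3.11         # macOS (Homebrew)
sudo apt install python3.11      # Debian/Ubuntu, if packaged for your release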

Dependencies issue?

The wrapper tries to install dependencies automatically; if that fails, set them up manually:

cd ~/.zen-mcp-server
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt

API key issues?

  • Check ~/.zen-mcp-server/.env has valid keys
  • Ensure at least one API key is configured
  • For OpenRouter, check your credits/limits

Requirements

  • Python 3.11+
  • Node.js >= 14.0.0
  • Git
  • At least one API key (Gemini, OpenAI, or OpenRouter)

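A quick way to confirm the prerequisites above are on your PATH:

python3 --version   # 3.11 or newer
node --version      # v14.0.0 or newer
git --version
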
Why No Docker?

We removed Docker because:

  • Faster startup - No container overhead
  • Less resource usage - No Redis, no Docker daemon
  • Simpler - Just Python and your API keys
  • Same features - Conversation threading works perfectly with in-memory storage

Links

  • Upstream project: https://github.com/199-biotechnologies/mcp-zen-plus (this repository is a mirror)

License

Apache 2.0 - See LICENSE
