🚀 MCP Agent & Server Ecosystem

A state-of-the-art demonstration of the Model Context Protocol (MCP), featuring autonomous agents, browser automation, and multi-server orchestration. The ecosystem leverages Groq's high-performance inference to provide a seamless agentic experience.

๐Ÿ—๏ธ Architecture Overview

The system operates in two distinct modes. Below are the precise technical architectures for both the interactive CLI and the standalone MCP Server mode.

🎯 1. MCP Architecture - Direct CLI Mode (app.py)

In this mode, the user interacts directly with the terminal-based agent, which handles reasoning and tool execution in a single host process.

graph TD
    %% Color Definitions
    classDef blue fill:#3498db,stroke:#333,stroke-width:2px,color:#fff
    classDef lightblue fill:#87ceeb,stroke:#333,stroke-width:2px,color:#000
    classDef purple fill:#9b59b6,stroke:#333,stroke-width:2px,color:#fff
    classDef yellow fill:#f1c40f,stroke:#333,stroke-width:2px,color:#000
    classDef orange fill:#e67e22,stroke:#333,stroke-width:2px,color:#fff
    classDef red fill:#e74c3c,stroke:#333,stroke-width:2px,color:#fff
    classDef green fill:#2ecc71,stroke:#333,stroke-width:2px,color:#fff

    User["👤 User"]:::blue
    App["🖥️ app.py (CLI Interface)"]:::lightblue
    
    subgraph Host ["🟦 MCP Host (Application Space)"]
        direction TB
        subgraph AgentBox ["🟪 MCPAgent (Decision Maker)"]
            Agent["🤖 MCPAgent<br/>(Decision Maker: LLM + Planning)"]:::purple
            LLM["🧠 LLM (Groq - Llama 3.3)<br/>(Reasoning / Decision Making)"]:::purple
            Client["🔌 MCPClient<br/>(Tool Execution Layer / Connector)"]:::yellow
            Agent --- LLM
            Agent --- Client
        end
    end

    Config["📄 browser_mcp.json (Registry)"]:::orange
    
    subgraph ServersBox ["🟥 MCP Servers (Tool Providers)"]
        PW["🌐 Playwright MCP Server"]:::red
        AB["🏠 Airbnb MCP Server"]:::red
        DDG["🔍 DuckDuckGo MCP Server"]:::red
    end

    subgraph ToolsBox ["🟩 Tools Layer"]
        Tools["browser_navigate<br/>browser_click<br/>duckduckgo_search<br/>airbnb_search"]:::green
    end
    end

    %% Logical Connections
    User --> App
    App --> Agent
    Client -->|reads config| Config
    Client --> PW
    Client --> AB
    Client --> DDG
    PW --> Tools
    AB --> Tools
    DDG --> Tools
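The decision loop in the diagram above can be illustrated with a toy simulation. This is a conceptual sketch only: the stub LLM and stub tools below stand in for Groq and the real MCP servers, so the control flow (user query → LLM picks a tool → client executes it) is visible without any external services.

```python
# Toy simulation of the MCPAgent decision loop shown above.
# The real agent delegates reasoning to the Groq LLM and tool calls to
# MCP servers via MCPClient; here both are stubbed for illustration.

def stub_llm(query: str, tools: list[str]) -> str:
    """Stand-in for the LLM: pick a tool name by crude keyword match."""
    if "airbnb" in query.lower():
        return "airbnb_search"
    if "search" in query.lower():
        return "duckduckgo_search"
    return "browser_navigate"

# Stand-ins for the tools exposed by the Playwright, Airbnb, and
# DuckDuckGo servers in the diagram.
STUB_TOOLS = {
    "browser_navigate": lambda q: f"navigated for: {q}",
    "duckduckgo_search": lambda q: f"searched for: {q}",
    "airbnb_search": lambda q: f"listings for: {q}",
}

def run_turn(query: str) -> str:
    """One agent turn: reason about the query, then execute the chosen tool."""
    tool = stub_llm(query, list(STUB_TOOLS))
    return STUB_TOOLS[tool](query)

result = run_turn("search for MCP docs")
```

In the real app.py, the reasoning step and the tool catalog both come from the mcp_use library rather than hand-written stubs.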

🎯 2. MCP Architecture - Server Mode (server.py)

In this mode, the project acts as an MCP server itself, exposing its capabilities to external clients like VS Code Copilot.

graph TD
    %% Color Definitions
    classDef blue fill:#3498db,stroke:#333,stroke-width:2px,color:#fff
    classDef lightblue fill:#87ceeb,stroke:#333,stroke-width:2px,color:#000
    classDef purple fill:#9b59b6,stroke:#333,stroke-width:2px,color:#fff
    classDef yellow fill:#f1c40f,stroke:#333,stroke-width:2px,color:#000
    classDef orange fill:#e67e22,stroke:#333,stroke-width:2px,color:#fff
    classDef red fill:#e74c3c,stroke:#333,stroke-width:2px,color:#fff
    classDef green fill:#2ecc71,stroke:#333,stroke-width:2px,color:#fff

    Ext["🌐 External Client<br/>(Caller of MCP Server)"]:::blue
    
    subgraph ServerHost ["🟦 MCP Host (server.py)"]
        Server["⚙️ server.py (FastMCP Server)"]:::lightblue
        Task["🛠️ run_task(query)"]:::lightblue
        
        subgraph AgentBoxServer ["🟪 MCPAgent"]
            AgentS["🤖 MCPAgent<br/>(Decision Maker)"]:::purple
            LLMS["🧠 Groq LLM"]:::purple
            ClientS["🔌 MCPClient<br/>(Tool Connector)"]:::yellow
            AgentS --- LLMS
            AgentS --- ClientS
        end
    end

    ConfigS["📄 browser_mcp.json"]:::orange
    
    subgraph ServersBoxServer ["🟥 MCP Servers (Tool Providers)"]
        PWS["🌐 Playwright"]:::red
        ABS["🏠 Airbnb"]:::red
    end

    subgraph ToolsBoxServer ["🟩 Tools"]
        TS["browser, search, etc."]:::green
    end
    end

    %% Logical Connections
    Ext -->|Calls run_task| Server
    Server --> Task
    Task --> AgentS
    ClientS -->|reads config| ConfigS
    ClientS --> PWS
    ClientS --> ABS
    PWS --> TS
    ABS --> TS
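The server-mode pattern above is decorator-based tool registration. The sketch below mimics that registration shape with the standard library only; it is not the actual FastMCP API, just an illustration of how a tool like run_task(query) ends up callable by name from an external client.

```python
# Stdlib mimic of the tool-registration pattern server.py uses.
# The real code registers run_task with FastMCP; this toy class only
# demonstrates the decorator -> registry -> call-by-name flow.

class ToyMCPServer:
    def __init__(self, name: str):
        self.name = name
        self.tools: dict[str, object] = {}  # tool name -> callable

    def tool(self):
        """Decorator that registers a function as a named tool."""
        def register(fn):
            self.tools[fn.__name__] = fn
            return fn
        return register

    def call(self, tool_name: str, *args):
        """What an external client's request ultimately triggers."""
        return self.tools[tool_name](*args)

mcp = ToyMCPServer("agent-server")

@mcp.tool()
def run_task(query: str) -> str:
    # In the real server.py, this builds MCPAgent + MCPClient and
    # runs the query through the Groq LLM and the MCP servers.
    return f"agent result for: {query}"

answer = mcp.call("run_task", "book a stay")
```

An external client such as VS Code sees only the tool name and signature; everything inside run_task stays on the server side.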

✨ Key Features

  • ⚡ High-Performance Inference: Powered by Groq's llama-3.3-70b-versatile for near-instantaneous reasoning.
  • 🌐 Autonomous Browser Control: Deep integration with Playwright for navigating and interacting with the web.
  • 🔌 Flexible Server Protocol: Connects to any standard MCP server for extensible tool capabilities.
  • 📂 State-Aware Memory: (In app.py) Maintains conversation state to handle complex, iterative requests.
  • 🛠️ Custom Server Extension: Includes its own FastMCP server for wrapping agentic workflows as reusable tools.

📂 Project Structure

Component          Responsibility
app.py             The flagship CLI chat interface and agent controller.
server.py          A FastMCP server implementation providing the run_task tool.
browser_mcp.json   The core registry for all connected MCP services.
pyproject.toml     Project dependencies, managed with the uv package manager.
.env               Secure storage for sensitive API keys.
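For orientation, browser_mcp.json follows the standard MCP "mcpServers" registry layout. The entries below are illustrative, not copied from the project: the package names and arguments are assumptions standing in for whatever servers the real file registers.

```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    },
    "airbnb": {
      "command": "npx",
      "args": ["-y", "@openbnb/mcp-server-airbnb"]
    },
    "duckduckgo": {
      "command": "uvx",
      "args": ["duckduckgo-mcp-server"]
    }
  }
}
```

Each entry tells the MCPClient how to launch one tool-provider process; adding a server is just adding another entry here.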

🛠️ Getting Started

1. Environment Setup

Ensure you have uv installed and a valid Groq API key.

# Create the environment file with your Groq API key
echo "GROQ_API_KEY=your_key_here" > .env
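A missing key otherwise surfaces as a confusing failure deep inside the first LLM call. One way to fail fast at startup is a small guard like this (a hypothetical helper, not part of the project; stdlib only):

```python
import os
from typing import Mapping

def require_api_key(env: Mapping[str, str] = os.environ) -> str:
    """Fail fast if GROQ_API_KEY is absent, instead of erroring mid-run."""
    key = env.get("GROQ_API_KEY", "").strip()
    if not key:
        raise RuntimeError("GROQ_API_KEY is not set; add it to your .env file")
    return key
```

Called once at the top of app.py or server.py (after the .env file is loaded), this turns a cryptic runtime error into a clear one-line message.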

2. Launch the Ecosystem

You can interact with the agent directly or run the custom server.

Start the Interactive Agent:

python app.py

Expose the Custom MCP Server:

python server.py

📖 Implementation Notes

The ecosystem is built on the mcp_use library, bridging LangChain components with the Model Context Protocol. The MCPAgent is configured with safety rails like max_steps to prevent infinite loops during autonomous execution.
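The max_steps rail works roughly like the toy loop below: the agent gets a bounded number of reason-act iterations, and execution is cut off even if it never declares itself done. The step function here is a stub standing in for one LLM-plus-tool iteration.

```python
def run_with_max_steps(step_fn, max_steps: int = 30):
    """Toy version of the max_steps rail: stop even if the agent never finishes.

    step_fn(step) returns (done, result); the real agent's step would be
    one LLM reasoning pass plus any tool call it decides to make.
    """
    for step in range(1, max_steps + 1):
        done, result = step_fn(step)
        if done:
            return result
    return f"stopped after {max_steps} steps without finishing"

# An agent that finishes on its third step returns normally:
result = run_with_max_steps(lambda s: (s == 3, f"done at step {s}"))
# A runaway agent that never finishes is still cut off:
cutoff = run_with_max_steps(lambda s: (False, None), max_steps=5)
```

Without such a bound, a tool that keeps returning ambiguous results could keep the autonomous loop running indefinitely.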

🔥 MCP enables a single agent to interact with multiple tool providers via standardized servers.

Note: The legacy mcp.json file is no longer used; all core configuration is consolidated in browser_mcp.json.

Made with ❤️ for the MCP Community
