automateyournetwork / chatgpt_mcp

An MCP server for ChatGPT Chat Completions

🧠 Ask ChatGPT - MCP Server (Stdio)

This is a Model Context Protocol (MCP) stdio server that forwards prompts to OpenAI’s ChatGPT (GPT-4o). It is designed to run inside LangGraph-based assistants and enables advanced summarization, analysis, and reasoning by accessing an external LLM.

📌 What It Does

This server exposes a single tool:

{
  "name": "ask_chatgpt",
  "description": "Sends the provided text ('content') to an external ChatGPT (gpt-4o) model for advanced reasoning or summarization.",
  "parameters": {
    "type": "object",
    "properties": {
      "content": {
        "type": "string",
        "description": "The text to analyze, summarize, compare, or reason about."
      }
    },
    "required": ["content"]
  }
}

Use this when your assistant needs to:

Summarize long documents

Analyze configuration files

Compare options

Perform advanced natural language reasoning
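Internally, the flow is a simple stdio dispatch: read one JSON request line, route a `tools/call` for `ask_chatgpt` to the model, and write a JSON result back. A minimal sketch of that routing (`handle_request` and the injected `model_call` are illustrative names, not the actual `server.py` API):

```python
import json

def handle_request(line: str, model_call) -> str:
    """Route one JSON request line; model_call stands in for the GPT-4o call."""
    req = json.loads(line)
    if req.get("method") == "tools/call":
        params = req["params"]
        if params.get("name") == "ask_chatgpt":
            content = params["arguments"]["content"]
            return json.dumps({"result": model_call(content)})
    return json.dumps({"error": "unknown method or tool"})
```

In the real server, `model_call` would be the GPT-4o request; here it is injected so the dispatch logic stands alone.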

🐳 Docker Usage

Build and run the container:


docker build -t ask-chatgpt-mcp .

docker run -e OPENAI_API_KEY=your-openai-key -i ask-chatgpt-mcp

🧪 Manual Test

Test the server locally using a one-shot request:


echo '{"method":"tools/call","params":{"name":"ask_chatgpt","arguments":{"content":"Summarize this config..."}}}' | \
  OPENAI_API_KEY=your-openai-key python3 server.py --oneshot
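When the shell quoting gets awkward, the same one-shot request line can be generated from Python instead (a sketch; the field layout follows the example above):

```python
import json

# Build the tools/call request programmatically to avoid shell-escaping issues.
request = {
    "method": "tools/call",
    "params": {
        "name": "ask_chatgpt",
        "arguments": {"content": "Summarize this config..."},
    },
}
line = json.dumps(request)  # pipe this line into: python3 server.py --oneshot
print(line)
```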

🧩 LangGraph Integration

To connect this MCP server to your LangGraph pipeline, configure it like this:


("chatgpt-mcp", ["python3", "server.py", "--oneshot"], "tools/discover", "tools/call")
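An adapter consuming this tuple essentially spawns the command, writes one JSON request to stdin, and reads one JSON reply from stdout. A hedged sketch of that exchange (`call_tool` is an illustrative helper, not a LangGraph API):

```python
import json
import subprocess

def call_tool(command: list, request: dict) -> dict:
    """Spawn the MCP server in one-shot mode and exchange a single message."""
    proc = subprocess.run(
        command,
        input=json.dumps(request),
        capture_output=True,
        text=True,
        check=True,
    )
    return json.loads(proc.stdout)
```

For example, `call_tool(["python3", "server.py", "--oneshot"], {"method": "tools/discover"})` would return the tool listing.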

βš™οΈ MCP Server Config Example

Here’s how to configure the server using an mcpServers JSON config:


{
  "mcpServers": {
    "chatgpt": {
      "command": "python3",
      "args": [
        "server.py",
        "--oneshot"
      ],
      "env": {
        "OPENAI_API_KEY": "<YOUR_OPENAI_API_KEY>"
      }
    }
  }
}

πŸ” Explanation

"command": Runs the script with Python

"args": Enables one-shot stdin/stdout mode

"env": Passes your OpenAI API key to the server process via its environment

🌍 Environment Setup

Create a .env file (auto-loaded with python-dotenv) or export the key manually:


OPENAI_API_KEY=your-openai-key

Or:


export OPENAI_API_KEY=your-openai-key
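Inside the server, picking up the key can look like the following sketch (`load_api_key` is an illustrative name, and python-dotenv is treated as optional):

```python
import os

def load_api_key() -> str:
    """Return OPENAI_API_KEY, loading a local .env first when python-dotenv exists."""
    try:
        from dotenv import load_dotenv  # installed during the Docker build
        load_dotenv()
    except ImportError:
        pass  # fall back to plain environment variables
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set")
    return key
```

Failing fast with a clear error beats a cryptic authentication failure on the first OpenAI request.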

📦 Dependencies

Installed during the Docker build:

openai

requests

python-dotenv

πŸ“ Project Structure

.
β”œβ”€β”€ Dockerfile        # Docker build for the MCP server
β”œβ”€β”€ server.py         # Main stdio server implementation
└── README.md         # You're reading it!

πŸ” Security Notes

Never commit .env files or API keys.

Store secrets in secure environment variables or secret managers.
