# k8s-AIassistant
A minimal setup to run a Kubernetes MCP server and a UI Plugin backend with Docker Compose.
## Overview

- MCP Server (`mcp-server-kubernetes`)
  - Exposes an HTTP SSE (Server-Sent Events) transport for MCP on port 3000.
  - Talks to your Kubernetes cluster using your kubeconfig.
- UI Plugin Backend (`ui-plugin-example/pkg/top-level-product`)
  - Node/Express service on port 8055.
  - Connects to an Ollama/OpenAI-compatible endpoint and to the MCP server.
## Prerequisites

- Docker and Docker Compose
- A valid kubeconfig on the host (e.g. `/home/<user>/.kube/config`)
- An accessible Ollama/OpenAI-compatible endpoint
## Quick Start

Note: the MCP server does not support multi-cluster kubeconfigs, so change the kubeconfig path to one that contains only the cluster you want it to manage (see the sketch below).
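One way to produce such a single-cluster kubeconfig is sketched below, assuming `kubectl` is installed on the host; `<context>` and the output path `~/.kube/mcp-config` are placeholders:

```bash
# Write a standalone kubeconfig containing only the chosen context, with credentials inlined
kubectl config view --minify --flatten --context <context> > ~/.kube/mcp-config
```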
- Set the host user and group so the container can read your kubeconfig (note: bash treats `UID` as read-only, so if the export fails, set `UID`/`GID` in a `.env` file next to `docker-compose.yml` instead):

  ```bash
  export UID=$(id -u)
  export GID=$(id -g)
  chmod 644 /home/<user>/.kube/config
  ```

- Build and start the services:

  ```bash
  docker compose build
  docker compose up -d
  ```
- Verify:
  - MCP SSE: `curl http://localhost:3000/sse`
  - UI Plugin: `curl http://localhost:8055/health`
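Because SSE keeps the connection open, the `/sse` check may not exit on its own. A time-bounded variant such as the one below can serve as a quick smoke test (the exact response body depends on the MCP server):

```bash
# -N disables output buffering so streamed events print as they arrive; --max-time closes after 5s
curl -N --max-time 5 http://localhost:3000/sse
curl http://localhost:8055/health
```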
## Configuration

Configuration is defined in `docker-compose.yml` (a sketch is shown below).

- Service `mcp` (port 3000)
  - Environment:
    - `ENABLE_UNSAFE_SSE_TRANSPORT=true`
    - `HOST=0.0.0.0`
    - `PORT=3000`
    - `KUBECONFIG_PATH=/kube/config`
  - Volume: bind-mount the host kubeconfig to `/kube/config:ro`
  - User: `${UID}:${GID}` (set from the host) to avoid permission issues
- Service `ui-plugin` (port 8055)
  - Environment:
    - `MCP_BASE=http://mcp:3000` (service name inside the compose network)
    - `OLLAMA_BASE` set to your Ollama/OpenAI-compatible endpoint
    - `MODEL_NAME=gpt-oss:20b`

Adjust `OLLAMA_BASE`, `MODEL_NAME`, the kubeconfig path, and the ports to match your environment.
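As a rough guide, a compose file wiring these services together might look like the following. This is a minimal sketch, not the repository's actual file: the `build` paths, the `ollama.example:11434` endpoint, and the host kubeconfig path are placeholders/assumptions.

```yaml
services:
  mcp:
    build: ./mcp-server-kubernetes              # assumption: built from the local source directory
    ports:
      - "3000:3000"
    environment:
      ENABLE_UNSAFE_SSE_TRANSPORT: "true"
      HOST: "0.0.0.0"
      PORT: "3000"
      KUBECONFIG_PATH: /kube/config
    volumes:
      - /home/<user>/.kube/config:/kube/config:ro   # host kubeconfig, mounted read-only
    user: "${UID}:${GID}"                       # match the host user so the kubeconfig is readable

  ui-plugin:
    build: ./ui-plugin-example/pkg/top-level-product   # assumption: uses the Dockerfile from Project Layout
    ports:
      - "8055:8055"
    environment:
      MCP_BASE: http://mcp:3000                 # compose service name resolves inside the network
      OLLAMA_BASE: http://ollama.example:11434  # placeholder: your Ollama/OpenAI-compatible endpoint
      MODEL_NAME: gpt-oss:20b
    depends_on:
      - mcp
```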
## Demo
Screenshot demo:
## Project Layout

- MCP server source: `mcp-server-kubernetes/`
- UI plugin backend: `ui-plugin-example/pkg/top-level-product/pages/server.js`
- UI plugin Dockerfile: `ui-plugin-example/pkg/top-level-product/Dockerfile`