A Go-based Model Context Protocol (MCP) server that exposes a uniform JSON-RPC/SSE/STDIO interface for managing Kubernetes clusters (resource discovery, CRUD, logs, metrics, events, Helm operations, etc.).
https://github.com/reza-gholizade/k8s-mcp-server
Stop context-switching between AI assistants and kubectl commands. This MCP server creates a direct bridge between your AI tools and Kubernetes clusters, letting you manage infrastructure through natural language while maintaining full programmatic control.
Every Kubernetes developer knows the drill: you're debugging an issue, chatting with Claude or ChatGPT about potential solutions, then switching to your terminal to run kubectl commands, copying output back to your AI assistant, and repeating the cycle. This server eliminates that friction entirely.
Instead of explaining your cluster state to an AI, your AI can directly inspect it. Instead of manually running the commands an AI suggests, it can execute them directly. You get the analytical power of modern AI with the operational control of native Kubernetes tooling.
Complete Cluster Visibility for AI
Resource discovery, detailed descriptions, logs, events, and metrics come back as structured data - the depth of kubectl describe, but AI-parseable.
Direct Operational Control
Create, update, and delete resources and run Helm operations directly from the conversation, instead of copy-pasting suggested commands into a terminal.
Production-Ready Integration
Works across multiple cluster contexts and respects your existing kubeconfig and RBAC policies.
Incident Response
"Check the logs for all pods in the payment namespace that have been restarting, and show me the recent events." Your AI can now gather this information in seconds without you running multiple kubectl commands.
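For comparison, gathering that same picture by hand usually means a round of commands along these lines (the namespace and pod names are illustrative):
# find the pods with the most restarts
kubectl get pods -n payment --sort-by='.status.containerStatuses[0].restartCount'
# pull logs from the previous (crashed) container instance for a suspect pod
kubectl logs <pod-name> -n payment --previous
# list recent events in the namespace
kubectl get events -n payment --sort-by='.lastTimestamp'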
Application Deployment
"Deploy the latest version of our API with 3 replicas and check if the rollout is healthy." The AI can create deployments, monitor progress, and report back with actual cluster state.
Resource Optimization
"Show me the CPU and memory usage for all nodes, identify the top resource consumers, and suggest optimization strategies." Direct metrics access enables data-driven recommendations.
Helm Operations
"Upgrade our monitoring stack to the latest version and roll back if any pods fail to start." Complex operational workflows become conversational.
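Done by hand, that workflow looks roughly like this (the release and chart names are placeholders):
# upgrade the release to the latest chart version
helm upgrade monitoring prometheus-community/kube-prometheus-stack -n monitoring
# watch whether the new pods come up cleanly
kubectl get pods -n monitoring -w
# if they don't, roll back to the previous revision
helm rollback monitoring -n monitoring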
VS Code (Recommended)
One-line installation scripts for macOS, Linux, and Windows. Once configured, Claude directly accesses your cluster through the MCP interface.
CLI Tools
Run in stdio mode to integrate with any MCP-compatible command-line tool or custom automation script.
Web Applications
SSE mode provides HTTP endpoints for browser-based AI tools or custom web interfaces.
Docker Deployment
Production deployments run as containerized services with proper RBAC and security policies.
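In practice, these options come down to how you start the server. A rough sketch - the stdio flag appears in the quick start below, while the SSE flag value is an assumption to verify against the repo README:
# stdio: for editor and CLI MCP clients
./k8s-mcp-server --mode stdio
# HTTP/SSE: for web clients (flag value is an assumption; the Docker image below listens on 8080)
./k8s-mcp-server --mode sse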
Built in Go with the official Kubernetes client libraries, this server translates MCP requests into native Kubernetes API calls. It supports multiple cluster contexts, respects existing RBAC policies, and maintains compatibility with standard kubectl configurations.
The tool provides 16 different operations covering the full spectrum of cluster management - from basic resource queries to complex Helm deployments. Each operation is designed to return structured data that AI systems can immediately understand and act upon.
Quick Local Setup:
git clone https://github.com/reza-gholizade/k8s-mcp-server.git
cd k8s-mcp-server
go build -o k8s-mcp-server main.go
./k8s-mcp-server --mode stdio
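Once it builds, you can sanity-check the stdio interface by piping a minimal MCP handshake into it and asking for the tool list. This is a rough smoke test; the exact responses depend on the server version and MCP protocol revision:
# initialize, acknowledge, then request the available tools over newline-delimited JSON-RPC
printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"tools/list"}' \
  | ./k8s-mcp-server --mode stdio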
VS Code Integration:
# macOS/Linux
curl -sSL https://raw.githubusercontent.com/reza-gholizade/k8s-mcp-server/main/scripts/install-vscode-config.sh | bash
Docker Deployment:
docker run -p 8080:8080 -v ~/.kube/config:/home/appuser/.kube/config:ro ginnux/k8s-mcp-server:latest
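With the container up, you can check that the HTTP side is listening. The SSE endpoint path here is an assumption - confirm the actual path in the repo README:
# open the event stream (-N disables buffering so SSE messages appear as they arrive)
curl -N http://localhost:8080/sse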
The server respects your existing kubectl configuration and RBAC permissions - no additional cluster setup required.
Your AI assistant now has the same cluster access you do, but with the ability to correlate information across resources, analyze patterns, and suggest optimizations that would take you considerably longer to identify manually.
This isn't about replacing your Kubernetes expertise - it's about amplifying it through AI that can actually interact with your infrastructure.