AI-driven Mini Kubernetes Dashboard & MCP server. Single-binary deployment with multi-cluster management, 49+ built-in MCP tools, AI-assisted diagnostics (Qwen2.5-Coder, DeepSeek), CRD/Helm support, and more.
https://github.com/weibaohui/k8m

Managing Kubernetes clusters while juggling AI workflows? You're probably switching between kubectl, dashboards, and AI chat interfaces constantly. K8M eliminates that context switching by being both a lightweight K8s dashboard AND a production-ready MCP server with 49+ built-in tools.
Unlike traditional Kubernetes dashboards that just show you cluster state, K8M transforms your AI models into capable Kubernetes operators. When Claude or any MCP-compatible AI needs to interact with your clusters, K8M provides the secure, permission-aware bridge.
Here's what sets it apart:
Instead of manually writing kubectl commands, your conversations with the AI look like this:
You: "Scale down the frontend deployment in staging to save costs"
AI: [Uses K8M MCP tools to check current replicas, then scales down]
AI: "Scaled frontend-deployment from 5 to 2 replicas in staging cluster"
You: "Why is the payment service pod crashing?"
AI: [Fetches logs, analyzes events, checks resource constraints via MCP]
AI: "Memory limit exceeded - increasing from 512Mi to 1Gi and redeploying"
The AI doesn't just tell you what to do - it actually executes the fixes through K8M's MCP interface.
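Under the hood, "executes the fixes" means the AI client sends an MCP tools/call request to K8M's endpoint. The sketch below shows roughly what that looks like on the wire; the tool name k8s_scale_deployment and its arguments are hypothetical placeholders (the real names come from the server's tools/list response), and an actual MCP client performs the initialize handshake and session handling before calling any tool.

# Rough illustration of an MCP tools/call request against K8M's endpoint.
# Tool name and arguments are hypothetical placeholders; a real MCP client
# negotiates the protocol (initialize handshake) before making calls like this.
curl -s -X POST http://localhost:3618/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
      "name": "k8s_scale_deployment",
      "arguments": {"cluster": "staging", "deployment": "frontend-deployment", "replicas": 2}
    }
  }'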
# Download and run - that's it
./k8m --kubeconfig ~/.kube/config
# Access dashboard at localhost:3618
# MCP server available at localhost:3618/mcp
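Once it's up, a quick check confirms the server is answering on the default port (3618, as noted above):

# Sanity check: the dashboard should respond on port 3618
curl -sI http://localhost:3618 | head -n 1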
Or deploy via Docker:
services:
  k8m:
    image: registry.cn-hangzhou.aliyuncs.com/minik8m/k8m
    ports:
      - "3618:3618"
    volumes:
      - ~/.kube/config:/root/.kube/config:ro
      - ./data:/app/data
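Assuming the snippet above is saved as docker-compose.yml, start it in the background and follow the logs:

# Start K8M and watch the startup logs; the dashboard will be at http://localhost:3618
docker compose up -d
docker compose logs -f k8m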
Multi-Cluster Management: Register multiple clusters; AI operations work across all of them with proper context switching.
Granular Permissions: AI operations pass through the permission-aware bridge described above, so you control exactly what the AI is allowed to do.
Production-Ready Logging: Every MCP call is logged with user context, timestamp, and operation details - perfect for audit trails.
Built-in AI Models: Qwen2.5-Coder-7B and DeepSeek-R1-Distill included, or bring your own Ollama/OpenAI setup.
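If you prefer running the model locally, pulling it with Ollama is one option; Ollama exposes an OpenAI-compatible API on its default port, and pointing K8M at that endpoint is covered in the project's configuration docs (the model tag below is Ollama's public qwen2.5-coder build, not something bundled with K8M):

# Optional: run a local model via Ollama (assumes Ollama is installed and its server is running)
ollama pull qwen2.5-coder:7b
# Ollama's OpenAI-compatible API is then available at http://localhost:11434/v1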
K8M doesn't replace your existing tools - it enhances them. Keep using kubectl, Helm, and your CI/CD pipelines. But now your AI assistants can operate on those same clusters directly through K8M's MCP tools.
K8M works with any MCP-compatible AI client.
The standardized MCP protocol means you're not locked into any specific AI platform.
Point your MCP client at http://localhost:3618/mcp and K8M transforms AI from a helpful assistant into a capable Kubernetes operator. Instead of copying and pasting kubectl commands from AI responses, your AI directly executes cluster management tasks through a secure, permission-aware interface.
Perfect for teams already using MCP who want to extend their AI capabilities into Kubernetes management - or for Kubernetes teams looking to add intelligent automation to their workflows.