A FastMCP-compatible server that exposes Grafana Loki querying capabilities via MCP.
https://github.com/tumf/grafana-loki-mcp

Stop context-switching between your AI assistant and Grafana dashboards. This MCP server brings your Loki logs directly into conversational workflows, letting you debug and investigate issues without leaving your AI tool.
You're already using AI assistants for code help, but when production breaks, you're back to clicking through Grafana. What if you could just ask "show me errors from the payment service in the last hour" and get formatted results instantly?
This MCP server connects your Loki instance to any MCP-compatible AI assistant, turning log queries into natural conversations. No more dashboard hunting when you're in the middle of debugging.
Direct Log Access: Query Loki using natural language through your AI assistant. Ask for specific errors, trace particular services, or analyze patterns across time ranges.
Smart Formatting: Results come back formatted for readability - JSON, markdown, or plain text. Set character limits per line to avoid overwhelming responses with verbose logs.
Flexible Time Queries: Use Grafana-style relative times (now-6h, now-1d) or precise timestamps. Perfect for incident response when every minute counts.
Label Intelligence: Discover available labels and their values programmatically. Great for exploring unfamiliar services or building dynamic queries.
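To make the relative-time handling concrete, here is a minimal sketch of how Grafana-style expressions like now-6h or now-1d can resolve to epoch timestamps. This is illustrative only; the server's own parser may differ.

```python
import re
import time

# Seconds per unit for Grafana-style suffixes.
_UNITS = {"s": 1, "m": 60, "h": 3600, "d": 86400}

def resolve_time(expr, now=None):
    """Resolve 'now', 'now-<N><unit>', or an absolute epoch value to seconds."""
    now = time.time() if now is None else now
    if expr == "now":
        return now
    m = re.fullmatch(r"now-(\d+)([smhd])", expr)
    if m:
        return now - int(m.group(1)) * _UNITS[m.group(2)]
    return float(expr)  # assume an absolute epoch timestamp

print(resolve_time("now-6h", now=1_700_000_000))  # 1699978400
```

Passing a fixed `now` makes the resolution deterministic, which is handy when correlating logs against a known incident window.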
Incident Response: "Show me all ERROR logs from the auth service in the last 30 minutes" - get immediate results without opening Grafana while you're troubleshooting.
Code Review Context: Ask for recent logs from a specific component you're reviewing to understand its runtime behavior and error patterns.
Performance Investigation: Query logs around specific timestamps when monitoring alerts fired, correlating application behavior with system metrics.
Service Discovery: Explore label values for unfamiliar services to understand their logging patterns and available metadata.
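For instance, the incident-response question above could translate into a LogQL call against Loki's query_range HTTP endpoint. The sketch below shows the shape of that request; the service label and limit are illustrative assumptions, not the server's actual code.

```python
from urllib.parse import urlencode

def error_query_url(base, service, minutes, now_s):
    """Build a Loki query_range URL for ERROR lines from one service."""
    params = {
        "query": f'{{service="{service}"}} |= "ERROR"',
        "start": (now_s - minutes * 60) * 10**9,  # Loki expects nanoseconds
        "end": now_s * 10**9,
        "limit": 100,
    }
    return f"{base}/loki/api/v1/query_range?{urlencode(params)}"

url = error_query_url("http://loki:3100", "auth", 30, 1_700_000_000)
print(url)
```

The assistant formulates the LogQL; the server handles turning it into an authenticated request and shaping the response.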
Install via pip (or run it directly with uvx, as below) and add it to your MCP configuration:
pip install grafana-loki-mcp
{
  "mcpServers": {
    "loki": {
      "command": "uvx",
      "args": ["grafana-loki-mcp", "-u", "GRAFANA_URL", "-k", "GRAFANA_API_KEY"]
    }
  }
}
Your AI assistant immediately gains four new capabilities:
query_loki - Run LogQL queries with flexible time ranges
get_loki_labels - Discover available log labels
get_loki_label_values - Get values for specific labels
format_loki_results - Transform results into readable formats

The server handles authentication, connection management, and response formatting. You focus on asking the right questions.
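Under the hood, label discovery maps onto Loki's documented label endpoints. The helpers below sketch those paths; since this server reaches Loki through Grafana, the real requests may be proxied, so treat the URLs as illustrative.

```python
from urllib.parse import quote

def labels_url(base):
    """Endpoint listing all label names known to Loki."""
    return f"{base}/loki/api/v1/labels"

def label_values_url(base, label):
    """Endpoint listing the values observed for one label."""
    return f"{base}/loki/api/v1/label/{quote(label)}/values"

print(label_values_url("http://loki:3100", "app"))
# http://loki:3100/loki/api/v1/label/app/values
```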
Uses FastMCP for reliable performance and supports both stdio and SSE transports. The server respects Grafana API rate limits and includes sensible defaults for log line limits to prevent overwhelming responses.
Set max_per_line to truncate verbose log entries, keeping AI responses focused and readable. Perfect for scanning large volumes of logs without losing context.
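As an illustration of that truncation combined with markdown output, here is a sketch that renders a Loki query_range-style response (the response shape follows Loki's HTTP API; the function itself is hypothetical, not the server's implementation):

```python
def to_markdown(result, max_per_line=100):
    """Render a Loki query_range-style response as markdown, truncating long lines."""
    out = []
    for stream in result["data"]["result"]:
        labels = ", ".join(f"{k}={v}" for k, v in sorted(stream["stream"].items()))
        out.append(f"### {labels}")
        for ts, line in stream["values"]:
            if len(line) > max_per_line:
                line = line[: max_per_line - 3] + "..."
            out.append(f"- `{ts}` {line}")
    return "\n".join(out)

sample = {"data": {"result": [{
    "stream": {"app": "auth", "level": "error"},
    "values": [["1700000000000000000", "connection refused: " + "x" * 200]],
}]}}
print(to_markdown(sample, max_per_line=40))
```

Grouping lines under their label set keeps the output scannable even when several streams match the query.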
Ready to make your logs conversational? Your debugging workflow just got a major upgrade.