Model Context Protocol (MCP) server for Axiom – lets AI agents run APL queries and list datasets through two MCP tools (queryApl, listDatasets).
https://github.com/axiomhq/mcp-server-axiom

Stop switching between Claude and your Axiom dashboard. This MCP server connects your AI assistant directly to your observability data, letting you query logs, metrics, and traces through natural conversation.
You're debugging a production issue. Claude is helping you reason through the problem, but every time you need to check actual data, you're copy-pasting queries into Axiom's UI, then bringing results back to continue the conversation. Your flow gets broken, context gets lost, and you're doing the same dance dozens of times per debugging session.
Meanwhile, your observability data sits there - rich with insights - but isolated from your AI workflow.
Direct Data Access: Your AI can run APL queries against your Axiom datasets without leaving the conversation. Ask "show me error rates for the payments service in the last hour" and get actual results, not generic suggestions.
Contextual Analysis: Instead of describing your metrics to Claude, let it see the raw data. It can spot patterns, correlate events, and suggest fixes based on what's actually happening in your systems.
Natural Language Queries: You don't need to remember APL syntax. Ask conversational questions and let Claude translate them into proper queries. "Which endpoints are slowest during peak hours?" becomes a structured APL query automatically.
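As a sketch of what that translation might produce, here is roughly what the assistant could send to the server for the "slowest endpoints" question. The tools/call envelope follows the MCP specification, but the queryApl argument name, the ['http-logs'] dataset, and the field names (duration_ms, uri) are illustrative assumptions, not the server's documented schema:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "queryApl",
    "arguments": {
      "query": "['http-logs'] | where _time > ago(1d) | summarize p95_ms = percentile(duration_ms, 95) by uri | sort by p95_ms desc | limit 10"
    }
  }
}

The results come back into the conversation, so the follow-up question ("now only the checkout endpoints") is just another message, not another trip to the dashboard.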
Incident Response: "Claude, check if we're seeing elevated 5xx errors in the last 30 minutes across all services." It queries your data, analyzes patterns, and suggests which service to investigate first.
Performance Debugging: "Show me response times for user-facing APIs when CPU usage spikes above 80%." Instead of building complex dashboards, you get instant correlation analysis.
Deployment Validation: "Compare error rates before and after the 2pm deployment." Your AI can validate deployments by directly examining your metrics and logs.
Capacity Planning: "What's our 95th percentile response time trend over the past month?" Get data-driven capacity decisions without manual chart analysis.
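The percentile-trend question, for example, maps onto a single summarize over daily bins. A minimal sketch, again assuming an ['http-logs'] dataset and a duration_ms field:

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "queryApl",
    "arguments": {
      "query": "['http-logs'] | where _time > ago(30d) | summarize p95_ms = percentile(duration_ms, 95) by bin(_time, 1d)"
    }
  }
}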
The server exposes two straightforward tools:
queryApl: Execute APL queries against your datasets
listDatasets: Browse available data sources

Built by the Axiom team in Go, it handles rate limiting, authentication, and error handling. You focus on asking better questions; it handles the data plumbing.
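Under the hood, your MCP client invokes both tools through standard tools/call requests. listDatasets is the simpler of the two; this sketch assumes it takes no arguments, which may differ from the server's actual schema:

{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "listDatasets",
    "arguments": {}
  }
}

queryApl is invoked the same way, with the APL string passed as an argument, as in the earlier example.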
go install github.com/axiomhq/axiom-mcp@latest
Add to your Claude config:
{
  "mcpServers": {
    "axiom": {
      "command": "axiom-mcp",
      "args": ["--config", "/path/to/config.txt"],
      "env": {
        "AXIOM_TOKEN": "your-token",
        "AXIOM_URL": "https://api.axiom.co"
      }
    }
  }
}
Now your AI assistant has direct access to your production data. No more context switching, no more manual queries, no more copy-paste debugging sessions.
Your observability data becomes part of your AI conversation, not a separate tool you have to juggle.