Model Context Protocol (MCP) Server for the Keboola Platform – exposes Keboola storage, SQL, jobs and metadata as MCP tools consumable by Claude, Cursor, LangChain, CrewAI, etc.
https://github.com/keboola/mcp-server

Stop copying and pasting data between your Keboola project and Claude. Stop writing custom integrations just to let your AI assistant query tables or trigger jobs. The Keboola MCP Server connects your AI tools directly to your entire data platform: no API wrappers, no middleware, no friction.
You've got terabytes of data in Keboola and powerful AI assistants that could analyze it, build transformations, and manage pipelines. But there's always that annoying gap: your AI can't actually see your data or do anything with your platform. So you end up as the middleman, running queries manually, copying results into chat windows, and explaining your schema over and over.
That workflow ends here.
The Keboola MCP Server turns your entire Keboola project into callable tools for Claude, Cursor, and other MCP-compatible AI clients. Your AI assistant can now query your data with SQL, explore buckets, tables, and metadata, create and update component configurations, and trigger and monitor jobs.
This isn't just read access—your AI can actually manage your data platform.
Data Analysis Sessions:
You: "Find correlations between customer churn and support ticket volume"
Claude: *queries customer tables, analyzes ticket data, runs correlation analysis, creates visualizations*
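For a sense of scale, the heavy lifting in that exchange is a single workspace query. A sketch of the kind of SQL the assistant might generate is below; the bucket, table, and column names are hypothetical placeholders, and CORR() assumes a Snowflake-backed workspace.

# Illustrative only: a correlation query the assistant might run through the
# server's SQL tool. "in.c-crm"."customers", "in.c-support"."tickets", and the
# 0/1 "churned_flag" column are hypothetical names, not part of the server.
CHURN_VS_TICKETS_SQL = """
SELECT
    CORR(c."churned_flag", COALESCE(t."ticket_count", 0)) AS "churn_ticket_correlation"
FROM "in.c-crm"."customers" AS c
LEFT JOIN (
    SELECT "customer_id", COUNT(*) AS "ticket_count"
    FROM "in.c-support"."tickets"
    GROUP BY "customer_id"
) AS t ON t."customer_id" = c."customer_id"
"""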
Pipeline Development:
You: "Create a transformation that aggregates daily sales by region"
Claude: *inspects sales tables, writes SQL transformation, creates the component configuration, tests with sample data*
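The SQL at the heart of such a transformation is small; a sketch is below, with hypothetical bucket, table, and column names standing in for your own. The assistant would wrap it in a component configuration through the server's tools rather than you pasting it anywhere.

# Illustrative only: SQL the assistant might place into a transformation
# configuration. "in.c-sales"."orders" and its columns are hypothetical.
DAILY_SALES_BY_REGION_SQL = """
CREATE TABLE "daily_sales_by_region" AS
SELECT
    "region",
    DATE_TRUNC('day', "order_date") AS "sales_day",
    SUM("amount") AS "total_sales"
FROM "in.c-sales"."orders"
GROUP BY "region", DATE_TRUNC('day', "order_date")
"""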
Operational Monitoring:
You: "Check if the nightly ETL jobs completed successfully"
Claude: *retrieves job status, identifies any failures, suggests debugging steps based on error logs*
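For comparison, here is roughly what that check looks like when scripted by hand against the platform's job API, which is exactly the kind of glue code the MCP server makes unnecessary. The queue URL, query parameters, and response field names below are assumptions from memory of Keboola's Job Queue API, so verify them against the current docs before relying on them.

import os

import requests

# Assumption: the Job Queue API lives at queue.<your stack>.keboola.com and
# accepts the same storage token used by the MCP server.
QUEUE_URL = "https://queue.YOUR_REGION.keboola.com/jobs"

resp = requests.get(
    QUEUE_URL,
    headers={"X-StorageApi-Token": os.environ["KBC_STORAGE_TOKEN"]},
    params={"limit": 50},  # assumed: most recent jobs first
    timeout=30,
)
resp.raise_for_status()

# Field names ("status", "component") are assumptions; inspect the raw
# payload for your stack before building on them.
failed = [job for job in resp.json() if job.get("status") == "error"]
for job in failed:
    print(job.get("id"), job.get("component"), job.get("status"))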
Your AI assistant becomes a data team member who actually understands your infrastructure.
Thanks to uvx, there's no installation process. Configure your AI client once, and it automatically downloads and runs the MCP server when needed:
{
  "mcpServers": {
    "keboola": {
      "command": "uvx",
      "args": ["keboola_mcp_server", "--api-url", "https://connection.YOUR_REGION.keboola.com"],
      "env": {
        "KBC_STORAGE_TOKEN": "your_token",
        "KBC_WORKSPACE_SCHEMA": "your_schema"
      }
    }
  }
}
Add this to Claude Desktop or Cursor, restart, and your AI immediately has full platform access.
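The same configuration translates directly if you want to wire the server into your own scripts or an agent framework: any MCP client can launch it over stdio. Here is a minimal sketch using the official MCP Python SDK (the mcp package); it starts the server the same way the JSON above does and prints the tools it exposes.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirrors the JSON config above: uvx downloads and runs the server on demand.
# Note: depending on the SDK version, env may replace rather than extend the
# inherited environment, so add anything uvx needs (e.g. PATH) if launch fails.
server = StdioServerParameters(
    command="uvx",
    args=["keboola_mcp_server", "--api-url", "https://connection.YOUR_REGION.keboola.com"],
    env={
        "KBC_STORAGE_TOKEN": "your_token",
        "KBC_WORKSPACE_SCHEMA": "your_schema",
    },
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())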
This isn't a toy integration—it covers your entire Keboola workflow:
For Data Engineers: Your AI assistant can now review pipeline configurations, debug failed jobs, and optimize transformations with full context of your data architecture.
For Analysts: Skip the context-switching between SQL tools and AI chats. Your assistant can query data, spot patterns, and generate insights without you copying table schemas or query results.
For DataOps: Automate operational tasks through natural language. "Check yesterday's job failures and suggest fixes" becomes a single conversation instead of manual investigation.
Works with the AI tools you're already using: Claude Desktop, Cursor, and any other MCP-compatible client, as well as agent frameworks like LangChain and CrewAI.
Your AI assistant is now a native part of your data stack.
The days of manual data-AI workflows are over. Your AI tools should work with your data platform as seamlessly as they work with your codebase. This MCP server makes that reality.