Model Context Protocol (MCP) server implementation that exposes Apache Airflow’s REST API through the MCP standard.
https://github.com/yangkyeongmo/mcp-server-apache-airflow

Stop context-switching between Claude and your Airflow UI. This MCP server puts your entire Apache Airflow instance at your AI assistant's fingertips: trigger DAGs, check pipeline status, debug failures, and manage configurations through natural conversation.
If you're running data pipelines in Airflow, you know the drill: chat with your team about a data issue, switch to Airflow UI, navigate to the right DAG, check logs, maybe restart a task, then switch back to explain what you found. This server eliminates that friction entirely.
Now you can ask Claude: "Check the status of our daily ETL pipeline and restart any failed tasks from this morning" and get immediate action plus context-aware responses about what went wrong and how to fix it.
This isn't a basic wrapper—it's comprehensive Airflow control through MCP. The server exposes every major Airflow REST API endpoint through standardized tools:
- Pipeline Management: List, trigger, pause, and monitor DAG runs with detailed status reporting
- Task Control: Check task instances, clear failed tasks, update task states, and access XCom data
- Configuration: Manage variables, connections, and pools without leaving your conversation
- Debugging: Access logs, import errors, and dataset lineage information
- System Monitoring: Health checks, statistics, and plugin information
Each operation returns structured data that Claude can interpret and act on, making complex multi-step Airflow operations feel like simple conversations.
Production Incident Response: "The customer data pipeline failed at 3 AM. What went wrong and can you restart it?" Claude checks the DAG run, identifies the failed task, examines logs, and triggers a retry—all while explaining the root cause.
Daily Pipeline Reviews: "Show me yesterday's ETL performance and highlight any concerning patterns." Get comprehensive status reports across all your critical DAGs without manually checking each one.
Environment Management: "Update the database connection string for the staging environment and test it." Claude modifies the connection, runs the test, and confirms everything works.
Data Quality Monitoring: "Check if our data validation DAGs caught any issues in the last 24 hours and show me the specific failures." Instant visibility into data quality without digging through logs.
Scheduled Maintenance: "Pause all non-critical DAGs for the next 2 hours while we update the database." Bulk operations that would take multiple UI clicks become single requests.
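A bulk operation like that last one reduces to one PATCH per DAG against Airflow's REST API, flipping the `is_paused` field. A minimal sketch of the pause step, assuming a hypothetical hard-coded list of non-critical DAG IDs (in practice Claude would discover them via the list-DAGs tool):

```python
import json

# Hypothetical DAG IDs; real ones would come from a "list DAGs" call.
NON_CRITICAL_DAGS = ["nightly_report", "backfill_archive"]

def build_pause_request(dag_id: str, paused: bool = True) -> tuple[str, str, str]:
    """Return (method, path, body) for the PATCH that pauses or unpauses a DAG."""
    path = f"/api/v1/dags/{dag_id}"
    body = json.dumps({"is_paused": paused})
    return "PATCH", path, body

requests_to_send = [build_pause_request(dag_id) for dag_id in NON_CRITICAL_DAGS]
```

One conversational request fans out into as many of these PATCHes as there are DAGs to pause, which is exactly the repetition the UI forces you to do by hand.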
Add to your Claude Desktop config:
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": ["mcp-server-apache-airflow"],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
Want to limit scope? Use the --apis flag to expose only specific API groups:
uvx mcp-server-apache-airflow --apis "dag,dagrun,taskinstance"
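If you'd rather bake that restriction into Claude Desktop itself, the flag can go into the args array of the config shown above; this is a sketch assuming the flag is parsed the same way when launched via uvx:

```json
"args": ["mcp-server-apache-airflow", "--apis", "dag,dagrun,taskinstance"]
```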
This server uses the official Apache Airflow client library, ensuring compatibility with your existing Airflow setup. It handles authentication and error responses, and maintains the same security model as your Airflow instance.
The comprehensive API coverage means you're not limited to basic operations—access advanced features like dataset lineage, XCom entries, and event logs through the same conversational interface.
For teams managing complex data pipelines, this transforms Airflow from a tool you visit into a system that actively participates in your workflow discussions. Your AI assistant becomes a knowledgeable team member who can actually take action on your Airflow infrastructure.