MCP server (Model Context Protocol) for managing and interacting with the Flowcore Platform
https://github.com/flowcore-io/mcp-flowcore-platform

Connect your AI assistants directly to your Flowcore Platform infrastructure. This MCP server bridges the gap between AI tools and your data pipelines, giving you conversational control over your entire Flowcore ecosystem.
If you're already using Flowcore for data processing and pipeline management, you know the friction of jumping between Claude/ChatGPT conversations and the Flowcore dashboard. You sketch out ideas with AI, then switch to the platform to check pipeline status, query data, or manage flows.
This MCP server eliminates that workflow interruption. Your AI assistant gets direct access to your Flowcore infrastructure, so you can manage pipelines, query data, and troubleshoot issues without leaving your conversation.
Conversational Infrastructure Management: Ask your AI to check pipeline status, trigger flows, or investigate data issues. No more "let me go check the dashboard and get back to you."
Context-Aware Debugging: When discussing performance issues or data anomalies, your AI can pull real-time information from your Flowcore environment to provide informed suggestions.
Automated Operations: Set up AI-driven monitoring and management workflows. Your assistant can proactively check system health and alert you to issues before they impact production.
Data Pipeline Troubleshooting: "Why is our customer data pipeline running slowly today?" Your AI can immediately check pipeline metrics, identify bottlenecks, and suggest optimizations.
Flow Management: "Trigger a full refresh of our analytics pipeline and let me know when it completes." Handle routine operations without context-switching to the platform.
System Monitoring: During development sprints, ask your AI to monitor specific data flows and summarize their performance at the end of each day.
Data Quality Checks: "Check if our latest data ingestion completed successfully and show me any anomalies." Get instant insights without navigating through multiple dashboard screens.
```sh
# Run directly with npx
npx @flowcore/platform-mcp-server --username <username> --pat <pat>

# Or install globally
npm install -g @flowcore/platform-mcp-server
platform-mcp-server --username <username> --pat <pat>
```
Add it to your MCP-compatible AI client configuration, and you're ready to start managing your Flowcore infrastructure conversationally.
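As a sketch of what that configuration can look like, here is a typical Claude Desktop entry (in `claude_desktop_config.json`) following the common `mcpServers` convention. The `"flowcore"` key is an arbitrary label of your choosing; replace `<username>` and `<pat>` with your own credentials:

```json
{
  "mcpServers": {
    "flowcore": {
      "command": "npx",
      "args": [
        "@flowcore/platform-mcp-server",
        "--username", "<username>",
        "--pat", "<pat>"
      ]
    }
  }
}
```

Other MCP-enabled clients use a similar command-plus-args structure; consult your client's documentation for the exact file location and schema.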
The server handles authentication, API translation, and error handling, so your AI gets clean, structured access to your Flowcore resources without you having to manage the integration complexity.
Performance Note: For high-volume data scenarios, the Flowcore team also offers a local read model MCP server that reduces token usage and speeds up queries by orders of magnitude.
Requires a Flowcore Platform account with a valid PAT (Personal Access Token). Compatible with Claude Desktop, ChatGPT, and other MCP-enabled AI clients.