MCP Toolbox for Databases – an open-source MCP server that adds connection pooling, auth, observability and a unified tooling control-plane for SQL/LLM agents.

https://github.com/googleapis/genai-toolbox

Your AI agent can write brilliant SQL queries, but actually executing them against your production database? That's where things get messy. Connection pools, authentication, rate limiting, observability – suddenly you're building database infrastructure instead of shipping features.
MCP Toolbox for Databases cuts through this complexity. It's a production-ready MCP server that sits between your AI applications and databases, handling all the operational concerns you don't want to think about.
Building AI agents that interact with databases means solving the same infrastructure problems over and over: connection pooling, credential management, rate limiting, observability. You end up with fragile, hard-to-maintain database code scattered across your AI applications, and every new agent means reimplementing the same plumbing.
MCP Toolbox provides a centralized control plane that your AI agents connect to instead of hitting databases directly. You define your tools once in YAML, and the server handles execution, security, and monitoring.
Here's what a database tool looks like:
```yaml
sources:
  analytics-db:
    kind: postgres
    host: prod-analytics.company.com
    database: analytics
    user: agent_reader

tools:
  user-activity-query:
    kind: postgres-sql
    source: analytics-db
    description: Get user activity metrics for a date range
    parameters:
      - name: start_date
        type: string
        description: Start date (YYYY-MM-DD)
      - name: end_date
        type: string
        description: End date (YYYY-MM-DD)
    statement: |
      SELECT user_id, COUNT(*) AS activity_count
      FROM user_events
      WHERE event_date BETWEEN $1 AND $2
      GROUP BY user_id
      ORDER BY activity_count DESC
```
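Note how the named parameters map onto PostgreSQL's positional placeholders: the server binds `start_date` to `$1` and `end_date` to `$2` in declaration order. A minimal sketch of that convention (the `bind_positional` helper is hypothetical, not part of Toolbox; the parameter names come from the config above):

```python
def bind_positional(parameters: list[dict], values: dict) -> list:
    """Order named argument values to match $1, $2, ... placeholders,
    following the declaration order in the tool's parameter list."""
    return [values[p["name"]] for p in parameters]

# Parameter list as declared in the tools.yaml above
params = [{"name": "start_date", "type": "string"},
          {"name": "end_date", "type": "string"}]

# Caller supplies values in any order; binding follows declaration order
args = bind_positional(params, {"end_date": "2024-01-31",
                                "start_date": "2024-01-01"})
print(args)  # ['2024-01-01', '2024-01-31']
```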
Your AI agent loads this tool with a few lines of Python:
```python
import asyncio

from toolbox_langchain import ToolboxClient

async def main():
    async with ToolboxClient("http://toolbox.internal:5000") as client:
        tools = await client.load_toolset("analytics")
        # Pass tools to your LangChain agent

asyncio.run(main())
```
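Outside LangChain, you can also hit the server's plain HTTP API directly. The `/api/tool/<name>/invoke` path below reflects the project's documented invocation endpoint, but treat it as an assumption and verify against your server version; this sketch only builds the request object and does not contact a server:

```python
import json
from urllib import request

def build_invoke_request(base_url: str, tool_name: str, params: dict) -> request.Request:
    """Build a POST request for the Toolbox tool-invocation endpoint.

    The /api/tool/<name>/invoke path is assumed from the project docs;
    check it against your Toolbox version.
    """
    url = f"{base_url}/api/tool/{tool_name}/invoke"
    body = json.dumps(params).encode("utf-8")
    return request.Request(url, data=body,
                           headers={"Content-Type": "application/json"},
                           method="POST")

req = build_invoke_request(
    "http://toolbox.internal:5000",
    "user-activity-query",
    {"start_date": "2024-01-01", "end_date": "2024-01-31"},
)
print(req.full_url)  # http://toolbox.internal:5000/api/tool/user-activity-query/invoke
# To actually execute it: urllib.request.urlopen(req)
```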
The server includes everything you need for production deployment, so there's no more writing custom database middleware for each AI application. It scales horizontally and integrates with your existing observability infrastructure.
The most compelling use case is connecting your IDE's AI assistant directly to your databases: instead of context-switching between your editor and database tools, you can explore schemas, run queries, and iterate on SQL without leaving your editor. This isn't just convenient – it fundamentally changes how you develop database-backed applications.
The server ships as a single Go binary or container image. For local development:
```shell
# Download the binary
curl -O https://storage.googleapis.com/genai-toolbox/v0.7.0/linux/amd64/toolbox
chmod +x toolbox

# Configure your tools.yaml and run
./toolbox --tools-file tools.yaml
```
For production deployments, the container includes everything needed for Kubernetes or Cloud Run.
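For Kubernetes, a minimal Deployment might look like the sketch below. The image reference, flags, and mount paths are illustrative assumptions (check the project's deployment docs for the official image and flag names); the point is the shape: a single container, the `tools.yaml` mounted from a ConfigMap, and the server's port exposed:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: toolbox
spec:
  replicas: 2
  selector:
    matchLabels:
      app: toolbox
  template:
    metadata:
      labels:
        app: toolbox
    spec:
      containers:
        - name: toolbox
          # Illustrative image reference; use the official image from the project docs
          image: toolbox:0.7.0
          args: ["--tools-file", "/config/tools.yaml"]
          ports:
            - containerPort: 5000
          volumeMounts:
            - name: tools-config
              mountPath: /config
      volumes:
        - name: tools-config
          configMap:
            name: toolbox-tools
```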
The project is backed by Google and actively maintained, with comprehensive documentation and client SDKs for major AI frameworks. At 1,564 stars and growing, it's becoming the standard way to connect AI agents to production databases.
Your AI agents are already smart enough to work with your data. Now give them the infrastructure they deserve.