Model Context Protocol server that lets LLMs query and modify MongoDB databases with smart ObjectId conversion, optional read-only mode, and tooling for Claude Desktop, Windsurf, Cursor, etc.
https://github.com/kiliczsh/mcp-mongo-server

Stop copying and pasting database queries between your terminal and Claude. MCP Mongo Server creates a direct bridge between your AI assistant and MongoDB, letting you query, analyze, and modify your data through natural conversation.
You're already using Claude, Cursor, or Windsurf for development. But when you need to work with your MongoDB data, you're stuck switching between tools - checking schemas in one place, running queries in another, then explaining the results back to your AI assistant.
MCP Mongo Server eliminates that friction. Your AI assistant can now directly query your collections, understand your schema, run aggregations, and even help you optimize queries - all without leaving the conversation.
MongoDB's ObjectId conversion is a constant source of bugs. You know the drill - LLMs return string IDs, but MongoDB expects ObjectIds, so you spend time debugging conversion errors.
This server handles ObjectId conversion automatically:
// LLM says: "Find user with ID 507f1f77bcf86cd799439011"
// Server automatically converts to ObjectId behind the scenes
{
collection: "users",
filter: { _id: "507f1f77bcf86cd799439011" } // Just works
}
Three conversion modes give you control:
auto: Converts based on field names (default)
force: Converts all string ID fields to ObjectId
none: No conversion for edge cases

Connect to production databases without fear. Read-only mode prevents any write operations and uses MongoDB's secondary read preference for better performance.
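The "auto" mode's field-name heuristic can be sketched in a few lines. This is an illustration of the idea, not the server's actual code; the ObjectId constructor is stubbed so the snippet runs without the mongodb driver:

```javascript
// Illustrative sketch of an "auto"-style heuristic: treat a 24-character
// hex string as an ObjectId candidate when the field name is "_id" or
// ends in "Id". Not the server's actual implementation.
const OBJECT_ID_RE = /^[0-9a-fA-F]{24}$/;

function looksLikeIdField(name) {
  return name === "_id" || name.endsWith("Id");
}

// Stand-in for `new ObjectId(hex)` from the mongodb package.
function toObjectId(hex) {
  return { $oid: hex };
}

function convertIds(filter) {
  const out = {};
  for (const [key, value] of Object.entries(filter)) {
    if (typeof value === "string" && looksLikeIdField(key) && OBJECT_ID_RE.test(value)) {
      out[key] = toObjectId(value); // string ID -> ObjectId
    } else if (value && typeof value === "object" && !Array.isArray(value)) {
      out[key] = convertIds(value); // recurse into nested operators like $in, $eq
    } else {
      out[key] = value; // everything else passes through untouched
    }
  }
  return out;
}

// "_id" and "customerId" get converted; "name" is left alone.
console.log(convertIds({ _id: "507f1f77bcf86cd799439011", name: "Ada" }));
```

"force" would skip the `looksLikeIdField` check, and "none" would return the filter unchanged.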
# Safe production querying
npx -y mcp-mongo-server mongodb://prod-cluster/mydb --read-only
Perfect for letting your AI assistant analyze production data, debug issues, or generate reports without risk.
Database Debugging: "Show me all failed orders from last week" - your AI can query directly and analyze patterns in the data.
Schema Exploration: Working with an unfamiliar collection? Your AI can examine the schema and suggest optimal queries.
Aggregation Pipeline Building: Describe what you want to analyze, and let your AI build and test complex aggregation pipelines.
Query Optimization: Your AI can run queries with execution plans enabled and suggest performance improvements.
Data Migration Planning: Analyze collection structures and data patterns before planning migrations.
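A request like "show me all failed orders from last week" maps to a plain find filter. A minimal sketch of what the assistant would construct; the `status` and `createdAt` field names are assumptions about your schema:

```javascript
// Build a filter for orders that failed in the last 7 days.
// "status" and "createdAt" are assumed field names for illustration.
function failedOrdersFilter(now = new Date()) {
  const weekAgo = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000);
  return {
    status: "failed",
    createdAt: { $gte: weekAgo },
  };
}

// The assistant would pass this as the filter of a find tool call:
const call = { collection: "orders", filter: failedOrdersFilter() };
console.log(call.filter.status); // "failed"
```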
Claude Desktop:
{
"mcpServers": {
"mongodb": {
"command": "npx",
"args": ["-y", "mcp-mongo-server", "mongodb://localhost:27017/mydb"]
}
}
}
Environment Variables (recommended for security):
{
"mcpServers": {
"mongodb": {
"command": "npx",
"args": ["-y", "mcp-mongo-server"],
"env": {
"MCP_MONGODB_URI": "mongodb://localhost:27017/mydb",
"MCP_MONGODB_READONLY": "true"
}
}
}
}
Works identically with Windsurf and Cursor - just add the same configuration to their settings.
Query with execution analysis:
{
collection: "users",
filter: { age: { $gt: 30 } },
projection: { name: 1, email: 1 },
explain: "executionStats" // Get performance insights
}
Aggregation pipelines:
{
collection: "orders",
pipeline: [
{ $match: { status: "completed" } },
{ $group: { _id: "$customerId", total: { $sum: "$amount" } } }
]
}
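To see what that pipeline computes, here is the same grouping expressed in plain JavaScript. MongoDB runs the pipeline server-side; this sketch only illustrates the semantics of the $match and $group stages:

```javascript
// Plain-JS equivalent of the pipeline above: keep completed orders
// ($match), then sum amounts per customerId ($group with $sum).
function totalsByCustomer(orders) {
  const totals = new Map();
  for (const o of orders) {
    if (o.status !== "completed") continue; // $match stage
    totals.set(o.customerId, (totals.get(o.customerId) ?? 0) + o.amount); // $group + $sum
  }
  // Shape the result like the pipeline's output documents.
  return [...totals].map(([customerId, total]) => ({ _id: customerId, total }));
}

const sample = [
  { customerId: "a", status: "completed", amount: 10 },
  { customerId: "a", status: "completed", amount: 5 },
  { customerId: "b", status: "pending", amount: 99 },
];
console.log(totalsByCustomer(sample)); // [ { _id: 'a', total: 15 } ]
```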
Document updates (when not read-only):
{
collection: "posts",
filter: { _id: "60d21b4667d0d8992e610c85" },
update: { $set: { published: true } }
}
# One command, start querying
npx -y mcp-mongo-server mongodb://localhost:27017/mydb
The -y flag skips npx's install confirmation prompt so the server starts without interaction. For development, clone the repo and use npm run watch for auto-rebuild.
Use environment variables to keep credentials secure:
export MCP_MONGODB_URI="mongodb://user:pass@cluster/db"
export MCP_MONGODB_READONLY="true"
npx -y mcp-mongo-server
Docker support included for containerized deployments.
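A containerized run might look like the following; the `mcp-mongo-server` image tag is an assumption (build it locally from the repo's Dockerfile first), and `host.docker.internal` stands in for wherever your MongoDB instance is reachable:

```shell
# Sketch of a containerized run. Assumes you built the image locally:
#   docker build -t mcp-mongo-server .
# -i keeps stdin open, since MCP servers communicate over stdio.
docker run --rm -i \
  -e MCP_MONGODB_URI="mongodb://host.docker.internal:27017/mydb" \
  -e MCP_MONGODB_READONLY="true" \
  mcp-mongo-server
```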
MCP servers communicate over stdio, making debugging tricky. Run the included inspector for browser-based debugging:
npm run inspector
Your AI assistant is already powerful for code. Make it powerful for data too. Install MCP Mongo Server and start having conversations about your actual database instead of copying queries back and forth.
The server handles the MongoDB complexity while your AI handles the intelligence - exactly how it should be.