Model Context Protocol Servers – a monorepo containing reference implementations (‘Everything’, ‘Fetch’, ‘Filesystem’, ‘Git’, ‘Memory’, ‘Sequential Thinking’, ‘Time’, etc.) that showcase how to build and run MCP servers with the official SDKs.
https://github.com/modelcontextprotocol/servers

Stop building custom integrations from scratch. The Model Context Protocol (MCP) Servers repository is your complete toolkit for connecting AI agents to any external system—from databases and APIs to file systems and cloud services.
When you're building AI agents that need to interact with real systems, you face the same integration challenges every time: authentication, data formatting, error handling, and security. MCP solves this by providing a standardized protocol that lets LLMs securely access tools and data sources through purpose-built servers.
This repository contains the definitive collection of MCP implementations—both official reference servers from Anthropic and over 500 community-built servers covering virtually every integration you'll need.
7 Official Reference Servers that demonstrate MCP best practices: Everything, Fetch, Filesystem, Git, Memory, Sequential Thinking, and Time.
500+ Community Servers covering every major platform and service.
Database Integration in 3 Lines of Code:
# Start PostgreSQL MCP server
npx -y @modelcontextprotocol/server-postgres postgresql://localhost/mydb
# Configure in Claude Desktop
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
    }
  }
}
Now your AI agent can query your database, inspect schemas, and analyze data directly through natural language.
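Under the hood, the host application acts as an MCP client: it connects to the server over stdio, discovers the tools it advertises, and invokes them on the model's behalf. Here is a minimal sketch of that flow using the official TypeScript SDK; the tool name ("query") and its "sql" argument are assumptions based on what the reference Postgres server advertises, so verify them against the listTools() output.

// Minimal sketch: connect to the Postgres MCP server over stdio and call its query tool.
// Assumes Node 18+, ESM (for top-level await), and @modelcontextprotocol/sdk installed.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"],
});

const client = new Client({ name: "example-host", version: "1.0.0" });
await client.connect(transport);

// Discover what the server offers, then run a read-only SQL query through it.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const result = await client.callTool({
  name: "query",                                     // tool name assumed from the reference server
  arguments: { sql: "SELECT count(*) FROM orders" }, // argument key assumed; table name is illustrative
});
console.log(JSON.stringify(result, null, 2));

await client.close();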
Multi-Service Workflow:
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/project/files"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "your_token" }
    },
    "git": {
      "command": "uvx",
      "args": ["mcp-server-git", "--repository", "/project"]
    }
  }
}
Your agent can now read project files, analyze Git history, and manage GitHub issues—all in one conversation.
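Behind that configuration, the host opens one MCP client connection per server and pools the tools they advertise. A rough sketch of the pattern with the TypeScript SDK follows; the paths mirror the config above, and the GitHub server is left out only to keep the example short.

// Sketch: one client connection per configured server, mirroring the JSON config above.
// (ESM / top-level await, as in the previous sketch.)
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const serverConfigs = [
  { name: "filesystem", command: "npx", args: ["-y", "@modelcontextprotocol/server-filesystem", "/project/files"] },
  { name: "git", command: "uvx", args: ["mcp-server-git", "--repository", "/project"] },
];

for (const cfg of serverConfigs) {
  const client = new Client({ name: "example-host", version: "1.0.0" });
  await client.connect(new StdioClientTransport({ command: cfg.command, args: cfg.args }));

  // Each server contributes its own tool set (file reads, git history, and so on).
  const { tools } = await client.listTools();
  console.log(`${cfg.name}: ${tools.map((t) => t.name).join(", ")}`);
}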
Zero Custom Integration Code: Use battle-tested implementations instead of building API wrappers from scratch. Each server handles authentication, rate limiting, and error handling.
Instant AI Agent Capabilities: Transform any LLM into a capable agent with access to your entire tech stack. Works with Claude Desktop, Cursor, VS Code extensions, and custom MCP clients.
Production-Ready Security: Official servers include configurable access controls, secure credential handling, and audit logging. Community servers follow the same security patterns.
Extensible Architecture: Start with existing servers and customize them, or use the reference implementations as templates for your own integrations.
Claude Desktop: The most popular MCP client—configure servers through simple JSON and start chatting with your data immediately.
Cursor & VS Code: Integrate MCP servers directly into your IDE for AI-assisted development with full context of your codebase and connected systems.
Custom Applications: Use the official SDKs (TypeScript, Python, C#, Java, Kotlin) to build your own MCP clients and embed AI agent capabilities into any application.
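As a sketch of what that embedding can look like, the loop below converts an MCP server's tool list into a generic tool-calling format and routes the model's tool requests back through callTool. The callModel function and its reply shape are hypothetical placeholders for whatever LLM API your application uses; only the Client calls come from the SDK.

// Sketch of an agent loop bridging MCP tools to an LLM's tool-calling interface.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Hypothetical model interface: returns either final text or a requested tool call.
type ModelReply = { text?: string; toolName?: string; toolArgs?: Record<string, unknown> };
type CallModel = (input: string, tools: unknown[]) => Promise<ModelReply>;

export async function runAgentTurn(client: Client, userMessage: string, callModel: CallModel) {
  // Expose the server's tools to the model (inputSchema is standard JSON Schema).
  const { tools } = await client.listTools();
  const toolDefs = tools.map((t) => ({
    name: t.name,
    description: t.description,
    input_schema: t.inputSchema,
  }));

  let reply = await callModel(userMessage, toolDefs);

  // While the model keeps requesting tools, dispatch each call through MCP
  // and feed the result back for the next turn.
  while (reply.toolName) {
    const result = await client.callTool({ name: reply.toolName, arguments: reply.toolArgs ?? {} });
    reply = await callModel(JSON.stringify(result), toolDefs);
  }
  return reply.text ?? "";
}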
This isn't just Anthropic's reference implementation—it's a thriving ecosystem with contributions from major companies like GitHub, Google, Microsoft, AWS, and hundreds of individual developers.
Official Integrations built and maintained by platforms like GitHub, Google, Microsoft, and AWS.
High-Quality Community Servers with comprehensive documentation, testing, and maintenance.
Use Any Server Immediately:
# Start the Memory server for persistent knowledge
npx -y @modelcontextprotocol/server-memory
# Start the Filesystem server for file operations
npx -y @modelcontextprotocol/server-filesystem /path/to/files
# Start any Python server
uvx mcp-server-git --repository /path/to/repo
Build Your Own Server: The repository includes comprehensive examples in multiple languages, development frameworks like FastMCP and EasyMCP, and detailed documentation for implementing custom servers.
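For orientation, here is a minimal sketch of a custom server built on the TypeScript SDK's high-level API. The "sum" tool is invented purely for illustration, and the registration call reflects the SDK's documented pattern at the time of writing, so check the SDK README if the API has since changed.

// Sketch: a custom MCP server exposing one made-up tool, using the TypeScript SDK.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "sum-server", version: "0.1.0" });

// Register a tool: name, argument schema (zod), and a handler returning MCP content.
server.tool(
  "sum",
  { a: z.number(), b: z.number() },
  async ({ a, b }) => ({
    content: [{ type: "text", text: String(a + b) }],
  })
);

// Serve over stdio so any MCP client (Claude Desktop, Cursor, custom hosts) can launch it.
await server.connect(new StdioServerTransport());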
Deploy at Scale: Community solutions include Docker containers, cloud hosting platforms, and management tools for running multiple MCP servers in production.
If you're building AI agents, you need structured access to external systems. You could spend months building custom integrations for each service your agents need to access, or you can leverage this repository's collection of proven, secure, and well-documented MCP servers.
With over 55,000 GitHub stars and active contributions from major tech companies, this has become the de facto standard for AI agent integrations. Whether you're prototyping a simple automation or building production AI applications, these MCP servers provide the reliable foundation you need.
Start with the official reference servers to understand the patterns, then explore the community ecosystem to find integrations for every service in your stack. Your AI agents will thank you.