Model Context Protocol (MCP) server that lets LLM apps use Chroma as an embedding-first database (vector + full-text + metadata search).
https://github.com/chroma-core/chroma-mcp

Stop building AI apps that forget everything between conversations. The Chroma MCP Server bridges the Model Context Protocol with Chroma's production-ready vector database, giving your LLM applications persistent memory, semantic search, and intelligent document retrieval.
Your current AI application probably looks like this: user asks a question, LLM processes it with whatever context fits in the prompt, responds, then forgets everything. That's fine for demos, but production AI applications need to remember, learn, and retrieve relevant information from previous interactions and knowledge bases.
This MCP server solves that by connecting any MCP-compatible LLM application directly to Chroma's vector database - no custom integrations, no API wrappers, just standardized MCP tools that work everywhere.
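Under MCP, each capability is exposed as a named tool the client invokes over JSON-RPC. As a rough sketch of that flow (the tool name `chroma_query_documents` and its argument names are assumptions based on the server's naming conventions, not a verified schema), a client request might look like:

```python
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical query against a support-ticket collection; the tool and
# argument names here are illustrative, not taken from the server's schema.
msg = build_tool_call(1, "chroma_query_documents", {
    "collection_name": "support-kb",
    "query_texts": ["password reset loops on mobile"],
    "n_results": 3,
})
print(msg)
```

The point is that any MCP client that can emit this message shape can use the server; nothing about the request is Claude-specific.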
- Flexible storage options (ephemeral, persistent, or Chroma Cloud)
- Complete document lifecycle
- Production-ready features
Customer Support Knowledge Base
```shell
# Create a persistent knowledge base
uvx chroma-mcp --client-type persistent --data-dir ./support-kb
```
Your AI can now remember every support ticket, documentation update, and resolution across conversations. When a customer asks about a specific issue, the AI retrieves relevant previous cases and solutions automatically.
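The retrieval step above can be sketched in plain Python. This toy substitutes bag-of-words vectors and cosine similarity for Chroma's real embedding models and indexes, purely to illustrate why the most relevant stored ticket surfaces for a new question:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

tickets = [
    "customer cannot reset password on mobile app",
    "invoice PDF download returns a 404 error",
    "two factor authentication codes arrive late",
]

query = "user stuck in a password reset loop"
best = max(tickets, key=lambda t: cosine(embed(query), embed(t)))
print(best)  # the password-reset ticket scores highest
```

Real embeddings capture meaning rather than shared words, so Chroma would also match paraphrases with no vocabulary overlap; the ranking mechanics are the same.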
Research Assistant with Memory
Build an AI research assistant that accumulates knowledge over time. Feed it papers, documents, and findings - it will remember everything and make connections across different research sessions.
Code Documentation Helper
Connect it to your codebase documentation. The AI can retrieve relevant code examples, API documentation, and implementation patterns based on the specific development questions being asked.
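Chroma can combine semantic search with metadata filtering, which is how a documentation helper narrows results to, say, API references in a given language. A minimal sketch of the filtering half, mimicking the shape of Chroma's `where` clauses (only equality and the `$in` operator are modeled here; Chroma's full operator set is larger):

```python
def matches(metadata: dict, where: dict) -> bool:
    """Return True if metadata satisfies a Chroma-style where clause.

    Supports plain equality ({"lang": "python"}) and the $in operator
    ({"kind": {"$in": [...]}}). Illustrative subset only.
    """
    for key, cond in where.items():
        value = metadata.get(key)
        if isinstance(cond, dict):
            if "$in" in cond and value not in cond["$in"]:
                return False
        elif value != cond:
            return False
    return True

docs = [
    {"text": "How to paginate the REST API", "meta": {"kind": "api", "lang": "python"}},
    {"text": "Deploying with Docker Compose", "meta": {"kind": "guide", "lang": "yaml"}},
    {"text": "Client retry helper example", "meta": {"kind": "api", "lang": "go"}},
]

# Keep only API docs written in Python or Go
hits = [d["text"] for d in docs
        if matches(d["meta"], {"kind": "api", "lang": {"$in": ["python", "go"]}})]
print(hits)
```

In Chroma itself this filter is applied server-side alongside the vector search, so the AI never has to sift irrelevant sections out of its own context window.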
Add this to your Claude Desktop config and you're up and running:

```json
{
  "mcpServers": {
    "chroma": {
      "command": "uvx",
      "args": ["chroma-mcp", "--client-type", "persistent", "--data-dir", "/path/to/your/data"]
    }
  }
}
```
For production applications, connect to Chroma Cloud or your self-hosted instance:
```json
{
  "mcpServers": {
    "chroma": {
      "command": "uvx",
      "args": [
        "chroma-mcp",
        "--client-type", "cloud",
        "--tenant", "your-tenant",
        "--database", "your-db",
        "--api-key", "your-key"
      ]
    }
  }
}
```
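For a self-hosted Chroma server, the same pattern applies with the HTTP client type. The `--host` and `--port` flags below follow the server's client-type options as I understand them; verify against `uvx chroma-mcp --help` for your version:

```json
{
  "mcpServers": {
    "chroma": {
      "command": "uvx",
      "args": [
        "chroma-mcp",
        "--client-type", "http",
        "--host", "localhost",
        "--port", "8000"
      ]
    }
  }
}
```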
Instead of spending weeks building vector search integration from scratch, you get flexible storage options, a complete document lifecycle, and production-ready retrieval out of the box.
The MCP protocol means this works with Claude Desktop today, and will work with future MCP-compatible applications without any changes to your setup.
```shell
# Quick start with ephemeral storage
uvx chroma-mcp

# Production setup with persistent storage
uvx chroma-mcp --client-type persistent --data-dir ./my-ai-memory
```
Your AI applications can finally remember, learn, and respond with context drawn from accumulated knowledge. The Chroma MCP Server turns any MCP-compatible LLM client into a knowledgeable assistant with persistent recall.
Ready to give your AI applications real memory? The setup takes 30 seconds, and the productivity gains are immediate.