Node-based Model Context Protocol (MCP) server that exposes Ragie knowledge-base retrieval through a single "retrieve" tool.

https://github.com/ragieai/ragie-mcp-server

Stop copy-pasting context from your docs into AI conversations. This MCP server connects Claude Desktop, Cursor, and other MCP-compatible tools directly to your Ragie knowledge base with a single retrieve command.
You know the drill: you're working with an AI model, need to reference company documentation or project specs, but the model has no clue about your internal knowledge. So you tab-switch, search your knowledge base, copy relevant chunks, paste them back, and hope you grabbed the right context. Repeat this dance dozens of times per day.
The Ragie MCP server eliminates this workflow friction entirely. Your AI tools can now query your knowledge base directly, pulling exactly the context they need without you lifting a finger.
- One-Command Setup: Install with npx @ragieai/mcp-server and you're connected to your Ragie knowledge base in seconds.
- Smart Retrieval: The retrieve tool doesn't just search: it can rerank results for relevance, apply a recency bias for time-sensitive content, and tune the number of results to match your query's complexity.
- Seamless Integration: Works out of the box with Claude Desktop and Cursor. Configure once, use everywhere.
- Flexible Targeting: Point queries at specific partitions in your knowledge base or search globally across all your content.
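As a sketch, a tuned retrieve call might pass arguments like the following. The query text is illustrative, and the parameter names other than query (topK, rerank, recencyBias) are assumptions mirroring Ragie's retrieval API, not confirmed from the server's tool schema:

```typescript
// Hypothetical arguments for the "retrieve" tool. Only "query" is
// confirmed by the docs above; the other names are assumptions
// modeled on Ragie's retrieval API.
const retrieveArgs = {
  query: "incident response runbook for database failover",
  topK: 8,           // assumed: how many chunks to return
  rerank: true,      // assumed: rerank results for relevance
  recencyBias: true, // assumed: favor recently updated documents
};
console.log(JSON.stringify(retrieveArgs));
```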
- Code Review with Context: "Explain how this authentication flow aligns with our security guidelines." The AI pulls your actual security docs without you hunting them down.
- Documentation-Aware Development: Building a new feature? Your AI can reference your existing API patterns, coding standards, and architectural decisions directly from your knowledge base.
- Support and Troubleshooting: "What's the recommended approach for handling this error based on our troubleshooting guides?" Instant access to your team's collective knowledge.
- Onboarding Acceleration: New team members can ask questions that get answered with your actual processes, guidelines, and institutional knowledge rather than generic responses.
For Cursor, drop this into your project's .cursor/mcp.json:
```json
{
  "mcpServers": {
    "ragie": {
      "command": "npx",
      "args": ["-y", "@ragieai/mcp-server"],
      "env": {
        "RAGIE_API_KEY": "your_api_key"
      }
    }
  }
}
```
For Claude Desktop, add it to your claude_desktop_config.json and restart. Your AI now has direct access to everything in your Ragie knowledge base.
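The Claude Desktop entry uses the same mcpServers shape. On macOS the file lives at ~/Library/Application Support/Claude/claude_desktop_config.json; on Windows, %APPDATA%\Claude\claude_desktop_config.json:

```json
{
  "mcpServers": {
    "ragie": {
      "command": "npx",
      "args": ["-y", "@ragieai/mcp-server"],
      "env": {
        "RAGIE_API_KEY": "your_api_key"
      }
    }
  }
}
```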
The server accepts practical configuration options that actually impact your workflow:
```shell
# Search company-specific documentation with a custom tool description
RAGIE_API_KEY=your_key npx @ragieai/mcp-server \
  --description "Search company engineering docs and processes" \
  --partition engineering_docs
```
This isn't just another API wrapper. The server handles MCP protocol details, Ragie API authentication, and result formatting so your AI tools get clean, relevant responses. You focus on asking the right questions, not managing the plumbing.
The retrieve tool accepts natural language queries and returns structured results your AI can immediately understand and act upon. No manual context switching, no copy-paste workflows, no hunting through multiple systems.
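Under the hood there is no magic: MCP clients and servers exchange JSON-RPC 2.0 messages over stdio, and a retrieve invocation is a tools/call request. A minimal sketch of that message, with an illustrative query string:

```typescript
// JSON-RPC 2.0 request an MCP client sends to invoke the "retrieve" tool.
// The query text is illustrative.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "retrieve",
    arguments: { query: "What is our deployment process?" },
  },
};
console.log(JSON.stringify(request));
```

The server's response carries the retrieved chunks back in the same JSON-RPC envelope, which is the "structured results" the AI consumes.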
Your knowledge base becomes a natural extension of your AI toolkit, accessible through the same interface you're already using for development tasks.
Ready to connect your AI tools to your actual knowledge? Install the Ragie MCP server and stop fighting with context management.