MCP server implementation for Kibela API integration, enabling LLMs to interact with Kibela content.
https://github.com/kiwamizamurai/mcp-kibela-server

Stop context-switching between your AI assistant and Kibela when you need team knowledge. This MCP server gives LLMs direct access to your Kibela workspace, turning your scattered team docs into queryable, actionable intelligence.
Your team's best practices, API docs, and tribal knowledge live in Kibela. But when you're coding and need that information, you're stuck copying and pasting between tools or losing context while you search.
This MCP server eliminates that friction. Your LLM can now search your team's Kibela content, retrieve specific notes, and even interact with the platform—all without leaving your development environment.
Intelligent Code Reviews: "Search our coding standards for exception handling patterns" returns actual team guidelines, not generic advice.
Contextual Documentation: Ask about your team's deployment process and get the actual runbook from Kibela, complete with recent updates and comments.
Onboarding Acceleration: New team members can query your knowledge base naturally through their AI assistant instead of hunting through folders.
Living Architecture Decisions: Reference architectural decision records (ADRs) stored in Kibela during design discussions without breaking flow.
A typical workflow looks like this:
# Quick setup - works with existing Kibela credentials
export KIBELA_TEAM=your-team    # the subdomain of your-team.kibe.la
export KIBELA_TOKEN=your-token  # personal access token from Kibela settings
npx @kiwamizamurai/mcp-kibela-server
Then your LLM can handle requests like "search our coding standards for exception handling patterns" or "pull up the latest deployment runbook" directly.
The server handles Kibela's GraphQL API complexity, rate limiting, and authentication so your LLM integration stays simple.
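To give a sense of what that abstraction covers, here's a minimal TypeScript sketch of a single authenticated search against the Kibela GraphQL endpoint. It assumes the standard endpoint pattern (https://your-team.kibe.la/api/v1); the query's field names are illustrative, not Kibela's actual schema:

// Sketch only: the query shape below is hypothetical, not Kibela's real schema.
const KIBELA_TEAM = process.env.KIBELA_TEAM!;
const KIBELA_TOKEN = process.env.KIBELA_TOKEN!;

async function searchKibelaNotes(query: string) {
  // One authenticated GraphQL POST per tool call (Node 18+ global fetch)
  const res = await fetch(`https://${KIBELA_TEAM}.kibe.la/api/v1`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${KIBELA_TOKEN}`, // token auth, no OAuth dance
    },
    body: JSON.stringify({
      // Hypothetical search query for illustration
      query: `query Search($q: String!) {
        search(query: $q, first: 10) {
          edges { node { title url } }
        }
      }`,
      variables: { q: query },
    }),
  });
  if (!res.ok) throw new Error(`Kibela API error: ${res.status}`);
  const { data, errors } = await res.json();
  if (errors) throw new Error(errors[0].message);
  return data.search.edges.map((e: { node: unknown }) => e.node);
}

The actual server layers pagination, rate-limit handling, and the MCP tool protocol on top of calls like this, so none of it leaks into your LLM integration.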
This isn't just another content connector. It's specifically designed for how development teams actually use knowledge bases.
Drop it into Cursor with a simple config:
{
  "mcpServers": {
    "kibela": {
      "command": "npx",
      "args": ["-y", "@kiwamizamurai/mcp-kibela-server"],
      "env": {
        "KIBELA_TEAM": "your-team",
        "KIBELA_TOKEN": "your-token"
      }
    }
  }
}
Docker users get the same simplicity with container isolation. The server handles all the MCP protocol details and Kibela API intricacies.
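A Docker-based config keeps the same shape. The image name below is an assumption for illustration, so check the repository's README for the actual published image:

{
  "mcpServers": {
    "kibela": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "KIBELA_TEAM",
        "-e", "KIBELA_TOKEN",
        "ghcr.io/kiwamizamurai/mcp-kibela-server:latest"
      ],
      "env": {
        "KIBELA_TEAM": "your-team",
        "KIBELA_TOKEN": "your-token"
      }
    }
  }
}

The -e flags forward the credentials from the config's env block into the container, so your token never needs to be baked into the image.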
Instead of asking your LLM generic questions and getting generic answers, you're now querying your team's actual knowledge. The difference between "How should I handle errors in this codebase?" and getting your team's specific error handling patterns from Kibela is significant.
This server turns your Kibela workspace from a reference you occasionally check into an active participant in your development process.
Get started: run npx @kiwamizamurai/mcp-kibela-server with your Kibela credentials and start making your team's knowledge work for you.