MCP server that wraps any stdio-based Language Server (gopls, rust-analyzer, pyright, etc.) and exposes semantic code-navigation tools (definition, references, rename, diagnostics …) to Model Context Protocol (MCP) enabled LLM clients.
https://github.com/isaacphi/mcp-language-server

Stop explaining your codebase to AI assistants. This MCP server connects any stdio-based language server (gopls, rust-analyzer, pyright, clangd) directly to MCP-enabled LLM clients, giving them the same semantic understanding of your code that your IDE has.
When you ask Claude or other AI assistants about your code, they're working with raw text. They can't jump to definitions, find all references, or understand the actual structure of your codebase. You end up copy-pasting code snippets, explaining context, and watching AI make suggestions based on incomplete information.
Your IDE already has this solved through language servers. The Language Server Protocol gives editors precise semantic information about code - definitions, references, type information, diagnostics. But your AI assistant? It's stuck parsing strings.
This MCP server acts as a translator between the Language Server Protocol and the Model Context Protocol. Your AI assistant suddenly gains access to the same semantic tools your IDE uses:
Instead of this conversation:
"This function isn't working. Here's the code... [paste 50 lines]"
"Can you show me where this function is called?"
"Here are the call sites... [paste more code]"
You get this:
"The `ProcessOrder` function has a bug."

The AI immediately sees the function definition, finds all 12 call sites across your codebase, identifies the issue in the context of how the function is actually used, and suggests a fix that won't break existing callers.
For code reviews: AI can understand the full impact of changes by following references and dependencies, not just the diff you show it.
For refactoring: AI suggestions become surgical rather than speculative because it can see the complete call graph and type relationships.
It supports any language server that communicates over stdio: gopls, rust-analyzer, pyright, clangd, and more.
Each language server brings its own semantic understanding. The AI assistant gets the same level of code intelligence that took years to develop in these tools.
Install the server:

```shell
go install github.com/isaacphi/mcp-language-server@latest
```
Install your language server (if you don't already have it):

```shell
# Go
go install golang.org/x/tools/gopls@latest

# Rust
rustup component add rust-analyzer

# Python
npm install -g pyright
```
Configure your MCP client (Claude Desktop example):

```json
{
  "mcpServers": {
    "language-server": {
      "command": "mcp-language-server",
      "args": [
        "--workspace", "/path/to/your/project",
        "--lsp", "gopls"
      ]
    }
  }
}
```
Start your MCP client. The AI assistant now has semantic understanding of your codebase.
Your existing language server setup works unchanged. This just gives your AI the same view your IDE already has.
The gap between what your development tools understand about your code and what AI assistants can see is holding back AI-assisted development. Language servers solved semantic code analysis years ago - this just makes that intelligence available to AI.
With 546+ stars and active development, this isn't experimental tooling. It's a production-ready bridge that turns AI assistants from text processors into code-aware development partners.
Stop teaching AI your codebase from scratch every conversation. Give it the same semantic foundation your IDE uses, and watch AI-assisted development actually work at the speed of thought.