🦀 MCP server that fetches current Rust crate docs, embeds them with OpenAI, and exposes an MCP tool (query_rust_docs) so AI coding assistants can answer crate-specific questions with up-to-date information.
https://github.com/Govcraft/rust-docs-mcp-server

You know the drill: your AI coding assistant confidently suggests a deprecated tokio API from 2021, or recommends serde patterns that haven't worked since version 0.8. Its training data is months or years behind the rapidly evolving Rust ecosystem, and you end up spending more time correcting AI suggestions than writing code yourself.
The Rust Docs MCP Server fixes this by giving your AI assistant a direct line to current crate documentation. Instead of relying on stale training data, your assistant can query live docs and get accurate, up-to-date information before suggesting code.
Rust moves fast. Crates update APIs, deprecate functions, and introduce new patterns regularly. When your AI assistant suggests a reqwest builder option that was renamed or removed two releases ago, you lose time debugging and second-guessing every suggestion.
This MCP server changes that dynamic. Your assistant can ask "How do I make a GET request with reqwest 0.12?" and receive an answer based on the actual current documentation, not some training snapshot from months ago.
Accurate first-try code: Instead of generating code that might work, your assistant generates code that uses current APIs correctly.
Reduced context switching: No more alt-tabbing to docs.rs to verify what your assistant suggested. The context comes directly from the source.
Confidence in suggestions: When your assistant prefixes responses with "From reqwest docs:", you know the information is current and reliable.
Crate-specific expertise: Run multiple server instances for different crates. Your assistant can simultaneously access current docs for tokio, serde, clap, and any other crates in your project.
The server does the heavy lifting once: on first run it downloads the crate's documentation, chunks it, and generates OpenAI embeddings for each section. Then it provides lightning-fast responses: each question is embedded, matched against the cached doc embeddings, and answered from the best-matching sections.
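The retrieval step described above can be sketched in a few lines. This is an illustrative model, not the server's actual code: `DocChunk`, `best_match`, and the tiny hand-made embeddings are all hypothetical stand-ins for the real cached OpenAI vectors.

```rust
// Hypothetical sketch: once every doc chunk has a cached embedding,
// answering a question reduces to a nearest-neighbor scan by cosine similarity.
struct DocChunk {
    text: &'static str,
    embedding: Vec<f32>,
}

fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

fn best_match<'a>(query: &[f32], chunks: &'a [DocChunk]) -> Option<&'a DocChunk> {
    chunks.iter().max_by(|a, b| {
        cosine_similarity(query, &a.embedding)
            .partial_cmp(&cosine_similarity(query, &b.embedding))
            .unwrap()
    })
}

fn main() {
    let chunks = vec![
        DocChunk { text: "Client::builder docs", embedding: vec![1.0, 0.0] },
        DocChunk { text: "get() shortcut docs", embedding: vec![0.0, 1.0] },
    ];
    // Pretend this vector came back from the OpenAI embeddings API.
    let query = vec![0.9, 0.1];
    println!("{}", best_match(&query, &chunks).unwrap().text); // Client::builder docs
}
```

Because the expensive part (embedding the docs) happens once, the per-question cost is just one embedding call plus this cheap scan.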
Get the binary from releases or build from source:
git clone https://github.com/Govcraft/rust-docs-mcp-server.git
cd rust-docs-mcp-server
cargo build --release
Set your OpenAI API key and run:
export OPENAI_API_KEY="sk-..."
rustdocs_mcp_server "[email protected]"
The first run downloads docs and generates embeddings (one-time cost). Subsequent runs use cached data and start instantly.
Configure multiple instances in your MCP client. Here's a Claude Desktop example:
{
"mcpServers": {
"rust-docs-tokio": {
"command": "rustdocs_mcp_server",
"args": ["[email protected]"]
},
"rust-docs-serde": {
"command": "rustdocs_mcp_server",
"args": ["serde@^1.0"]
},
"rust-docs-reqwest": {
"command": "rustdocs_mcp_server",
"args": ["[email protected]"]
}
}
}
Your assistant can now query any of these crates for current documentation context before suggesting code.
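Concretely, each query arrives as a standard MCP `tools/call` request naming the server's `query_rust_docs` tool. The request shape below follows the MCP JSON-RPC spec; the argument name is an assumption here, so check the tool schema the server advertises:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query_rust_docs",
    "arguments": {
      "question": "How do I make a GET request with reqwest?"
    }
  }
}
```

Your MCP client (Claude Desktop, an IDE plugin, etc.) handles this wire format for you; the assistant just decides when to call the tool.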
Feature-specific docs: Some crates require specific features enabled. Handle this easily:
rustdocs_mcp_server "[email protected]" -F runtime-tokio-hyper-rustls
Version targeting: Pin to specific versions or use semantic versioning:
rustdocs_mcp_server "serde@^1.0" # Latest 1.x
rustdocs_mcp_server "tokio@=1.25.0" # Exact version
Smart caching: The server caches by crate name, version requirement, AND feature set. Different feature combinations get separate caches, ensuring accuracy.
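One way to picture that caching rule is as a composite cache key. The sketch below is illustrative (not the server's actual implementation): it folds the crate name, version requirement, and sorted feature list into one string, so the same crate built with different features can never collide.

```rust
// Hypothetical cache-key scheme: crate + version requirement + sorted features.
// Sorting makes the key order-independent, so ["a","b"] and ["b","a"] match.
fn cache_key(krate: &str, version_req: &str, features: &[&str]) -> String {
    let mut feats: Vec<&str> = features.to_vec();
    feats.sort_unstable();
    format!("{}@{}#{}", krate, version_req, feats.join(","))
}

fn main() {
    // Two different feature sets for the same crate yield two distinct keys.
    println!("{}", cache_key("sqlx", "^0.7", &["postgres", "runtime-tokio"]));
    println!("{}", cache_key("sqlx", "^0.7", &[]));
}
```

Whatever the real scheme looks like, the effect is the same: embeddings generated for one feature combination are never served for another.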
Instead of the frustrating cycle of asking, getting outdated code, hitting a compile error, checking docs.rs, and correcting the assistant, you get a smooth workflow: you ask, the assistant queries the current docs first, and the suggested code compiles against the APIs you actually have.
This server turns your AI assistant from a sometimes-helpful suggestion engine into a reliable pair programming partner with always-current Rust knowledge.
The project's 110 stars and growing community suggest it solves a real problem. Stop fighting outdated AI suggestions and start coding with confidence.