MCP server that lets LLMs search Go packages on pkg.go.dev and stream the results over stdio.
https://github.com/yikakia/godoc-mcp-server

Stop context-switching to pkg.go.dev every time your LLM needs Go package information. This MCP server puts the entire Go ecosystem at your AI assistant's fingertips: searchable, cached, and streamed directly into your development conversation.
You're deep in a coding session with your LLM, architecting a new Go service. Your assistant suggests using a specific package, but needs to know the API surface. Or you're comparing HTTP routers and want real documentation. The flow breaks: copy package name, open browser, search pkg.go.dev, read docs, switch back, resume conversation.
godoc-mcp-server eliminates this friction entirely. Your LLM can search Go packages, fetch full documentation, and explore version histories without breaking stride.
Real-time Go ecosystem access: Search across all packages on pkg.go.dev with the same interface your LLM uses for other tools. No more "I don't have current information about that package."
Complete package context: Full documentation, import paths, version information, and usage statistics flow directly into your conversation. Your LLM gets the same information you'd gather manually.
Smart caching: Local cache prevents redundant API calls when exploring related packages or revisiting documentation during the same session.
Subpackage intelligence: The server understands Go's subpackage structure and guides LLMs through proper path construction—critical for complex modules with nested packages.
Architecture discussions: Compare HTTP frameworks by having your LLM fetch and analyze documentation for gin, echo, and fiber in a single conversation. No tab juggling.
Code generation: Generate imports and usage examples with current API information. Your LLM writes code against actual package interfaces, not outdated training data.
Dependency evaluation: Explore package popularity, maintenance status, and API stability before committing to dependencies. Ask questions like "How actively maintained is this package?" and get data-backed answers.
Learning acceleration: Deep dive into unfamiliar packages with guided exploration. Your LLM can walk you through package structure, key types, and usage patterns using live documentation.
The server runs over stdio—no ports, no HTTP servers, just clean MCP communication. Install the binary or build from source:
```
go install github.com/yikakia/godoc-mcp-server/cmd/godoc-mcp-server@latest
```
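Most MCP clients only need to be pointed at the binary as a stdio command. If you want to poke at the server directly before wiring it into a client, here's a minimal sketch using the official MCP Python SDK: it launches the binary as a child process and lists the tools it advertises. The assumption that `godoc-mcp-server` is on your PATH is mine; adjust the command if you installed it elsewhere.

```python
# Minimal sketch: connect to godoc-mcp-server over stdio and list its tools.
# Assumes the `mcp` Python SDK is installed and the binary is on PATH.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server as a child process; MCP messages flow over stdin/stdout.
    params = StdioServerParameters(command="godoc-mcp-server", args=[])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")


asyncio.run(main())
```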
Three core tools handle the Go ecosystem:
searchGoPackage: Find packages by keyword or functionality
getPackageDocumentation: Retrieve complete docs for specific packages
listPackageVersions: Explore version history and compatibility

The server includes thoughtful prompt engineering details: descriptions that help LLMs chain tool calls correctly. When a search returns subpackages, the LLM knows how to construct proper import paths for subsequent documentation requests.
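To make that chaining concrete, here's a rough sketch of the two-step flow a client performs: search first, then fetch documentation for one of the results. The argument names ("query", "path") and the example import path are placeholders of mine, not the server's documented schema; the real parameter names come from the tool definitions returned by list_tools().

```python
# Sketch of chaining the tools: search, then fetch docs for a result.
# Tool argument names ("query", "path") are assumptions; check the input
# schemas returned by list_tools() for the real parameter names.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def explore(query: str) -> None:
    params = StdioServerParameters(command="godoc-mcp-server", args=[])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Step 1: search pkg.go.dev for candidate packages.
            search = await session.call_tool(
                "searchGoPackage", arguments={"query": query}
            )
            print(search.content)

            # Step 2: pull full documentation for one of the hits
            # (import path hard-coded here purely for illustration).
            docs = await session.call_tool(
                "getPackageDocumentation",
                arguments={"path": "github.com/gin-gonic/gin"},
            )
            print(docs.content)


asyncio.run(explore("http router"))
```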
Go's ecosystem moves fast. Package APIs evolve, new libraries emerge, and best practices shift. This server keeps your LLM current with the Go world without manual curation or stale context injection.
You're not just adding another MCP server—you're giving your development workflow direct access to Go's collective knowledge. Every package search, documentation lookup, and version check happens in-context, maintaining flow while keeping decisions informed.
The difference between asking "Can you help me choose an HTTP router?" and getting generic advice, versus getting current data on gin's 45k+ imports, echo's middleware ecosystem, and fiber's performance characteristics, is substantial. This server delivers the latter.