A Model Context Protocol (MCP) server that provides AI-powered semantic search and Q&A over the Vercel AI SDK documentation.
https://github.com/IvanAmador/vercel-ai-docs-mcp

Stop alt-tabbing between your AI assistant and the Vercel AI SDK documentation. This MCP server brings semantic search and contextual Q&A directly into Claude Desktop, Cursor, or any MCP-compatible client.
You're building with the Vercel AI SDK and need to understand how streamText works with custom parameters, or to figure out the right way to handle tool calling. Instead of getting straight answers, you're alt-tabbing to the docs, skimming pages that almost answer your question, and piecing the rest together yourself. This server changes that workflow:
- Semantic Documentation Search: Ask "How do I implement streaming with custom stop sequences?" and get precise answers with relevant code examples from the actual Vercel AI SDK docs.
- Conversation Context: Follow up with "What about error handling in that scenario?" without re-explaining what you're working on.
- Direct Integration: Works inside Claude Desktop, Cursor, or any MCP client; no context switching required.
- API Integration Questions: "How do I configure the OpenAI provider with custom headers?" Get the exact configuration code instead of generic provider setup instructions (see the sketch after this list).
- Advanced Usage Patterns: "What's the difference between generateText and streamText for tool calling?" Understand the specific trade-offs with concrete examples.
- Troubleshooting: "Why is my streaming response cutting off early?" Get targeted debugging steps based on common SDK issues.
- Migration Help: "How do I upgrade from the old completion API to the new generateObject?" See migration patterns with before/after code.
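To give a flavor, two of the questions above (custom provider headers, custom stop sequences) map to SDK code along these lines. This is a hand-written sketch of what the docs cover, not output from the server; it assumes the AI SDK 4.x API, and the header value is purely illustrative:

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

// Custom headers ride along on every request to the provider.
const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  headers: { "X-Org-Id": "acme" }, // illustrative header, not a required setting
});

// Streaming with custom stop sequences (AI SDK 4.x style).
const result = streamText({
  model: openai("gpt-4o-mini"),
  prompt: "Explain tool calling in two sentences.",
  stopSequences: ["END"], // generation halts when the model emits this sequence
});

for await (const delta of result.textStream) {
  process.stdout.write(delta);
}
```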
The server crawls and indexes the entire Vercel AI SDK documentation using FAISS for vector similarity search. It provides two query modes, exposed as three tools over MCP (example call below):
- agent-query: natural-language Q&A with session context
- direct-query: direct similarity search over the documentation
- clear-memory: reset conversation context when switching topics
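Under the hood these are ordinary MCP tools, so any client issues a standard `tools/call` request. A sketch of the wire format; the argument names (`query`, `sessionId`) are assumptions about this server's schema, so check the tool definitions it advertises:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "agent-query",
    "arguments": {
      "query": "How do I implement streaming with custom stop sequences?",
      "sessionId": "dev-session"
    }
  }
}
```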
Installation:

```bash
git clone https://github.com/IvanAmador/vercel-ai-docs-mcp.git
cd vercel-ai-docs-mcp
npm install && npm run build
npm run build:index   # creates the searchable documentation index
```
Add your Google API key to `.env`:

```
GOOGLE_GENERATIVE_AI_API_KEY=your-key-here
```
Claude Desktop Integration: Add to your MCP config:

```json
{
  "mcpServers": {
    "vercel-ai-docs": {
      "command": "node",
      "args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
      "env": {
        "GOOGLE_GENERATIVE_AI_API_KEY": "your-key-here"
      }
    }
  }
}
```
Cursor Integration: Add a `.cursor/mcp.json` file with the same configuration format.
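Beyond those two clients, you can also drive the server from your own scripts with the official MCP TypeScript SDK. A minimal sketch, assuming the tool schema shown above; the `sessionId` field and the `docs-client` name are illustrative:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the built server over stdio, the same way Claude Desktop does.
const transport = new StdioClientTransport({
  command: "node",
  args: ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
  env: { GOOGLE_GENERATIVE_AI_API_KEY: "your-key-here" },
});

const client = new Client({ name: "docs-client", version: "1.0.0" });
await client.connect(transport);

// First question establishes the session context.
const first = await client.callTool({
  name: "agent-query",
  arguments: {
    query: "How do I implement streaming with custom stop sequences?",
    sessionId: "dev-session", // assumed field name; check the advertised schema
  },
});
console.log(first.content);

// The follow-up reuses the same session, so there's no need to restate the topic.
const followUp = await client.callTool({
  name: "agent-query",
  arguments: {
    query: "What about error handling in that scenario?",
    sessionId: "dev-session",
  },
});
console.log(followUp.content);
```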
Instead of breaking flow to hunt through documentation, you ask questions in natural language and get SDK-specific answers with code examples. The conversation context means follow-up questions build on previous answers, making complex implementation discussions actually useful.
Whether you're prototyping with the AI SDK or debugging production issues, having documentation search integrated directly into your AI assistant eliminates the constant context switching that slows down development.
The MCP architecture means this works with any compatible client - Claude Desktop today, but also Cursor and other tools as MCP adoption grows.