A TypeScript Model Context Protocol (MCP) server that relays requests from Claude Desktop, LibreChat, or any MCP-compatible client to any OpenAI-compatible Chat Completions API (OpenAI, Perplexity, Groq, xAI, PyroPrompts, etc.).
https://github.com/pyroprompts/any-chat-completions-mcp

Stop settling for just Claude's perspective. This MCP server turns your Claude Desktop into a universal AI client, giving you instant access to OpenAI, Perplexity, Groq, xAI, and dozens of other models—all from within your familiar Claude interface.
You've been living in a single-model bubble. When Claude hits its limits on coding, you switch to Cursor. When you need web search, you open Perplexity. When you want lightning-fast responses, you jump to Groq. Each context switch kills your flow and fragments your thinking.
This MCP server eliminates that workflow fragmentation entirely.
Configure once, access everything. Set up multiple providers simultaneously and call them like native Claude tools:
{
  "mcpServers": {
    "perplexity-search": {
      "command": "npx",
      "args": ["@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "your-perplexity-key",
        "AI_CHAT_NAME": "Perplexity",
        "AI_CHAT_MODEL": "sonar",
        "AI_CHAT_BASE_URL": "https://api.perplexity.ai"
      }
    },
    "groq-speed": {
      "command": "npx",
      "args": ["@pyroprompts/any-chat-completions-mcp"],
      "env": {
        "AI_CHAT_KEY": "your-groq-key",
        "AI_CHAT_NAME": "Groq",
        "AI_CHAT_MODEL": "llama3-70b-8192",
        "AI_CHAT_BASE_URL": "https://api.groq.com/openai/v1"
      }
    }
  }
}
Now when you need web-informed answers, ask Claude to "use Perplexity to research this." When you need rapid iteration, "use Groq for quick feedback." All within the same conversation thread.
Model Shopping Without Context Loss: Compare responses from GPT-4, Claude, and Llama side-by-side without copy-pasting between interfaces. Perfect for finding the best model for specific tasks.
Specialized Model Access: Use Perplexity for research, Groq for speed, PyroPrompts for custom fine-tunes, xAI for Grok's unique perspective—all orchestrated by Claude's reasoning.
Cost Optimization: Route different query types to cost-effective models automatically. Use expensive flagship models only when you need their capabilities.
Workflow Continuity: Stay in Claude Desktop while accessing the entire AI ecosystem. No more tab switching or conversation fragmentation.
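To make the cost-optimization idea concrete, here is a minimal sketch of a routing helper. It is hypothetical glue code, not part of the package: the provider names and heuristics are assumptions for illustration, matching the example config above.

```typescript
// Hypothetical routing helper (not part of any-chat-completions-mcp):
// pick a configured provider based on what the query seems to need.
type Provider = "Perplexity" | "Groq" | "OpenAI";

function routeQuery(query: string): Provider {
  // Time-sensitive, web-informed questions go to a search-backed model.
  if (/\b(latest|today|news|current)\b/i.test(query)) {
    return "Perplexity";
  }
  // Short, iterative prompts go to a low-latency provider.
  if (query.length < 80) {
    return "Groq";
  }
  // Everything else falls back to a flagship model.
  return "OpenAI";
}
```

In practice Claude itself can do this routing from natural language ("use Groq for quick feedback"), but an explicit helper like this is useful if you script against the server directly.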
Beyond Claude Desktop, this integrates seamlessly with LibreChat, giving you the same multi-model access in any MCP-compatible environment. The universal OpenAI-compatible API means virtually any provider works out of the box.
Install via npx (no local setup required):
npx @pyroprompts/any-chat-completions-mcp
Or clone and customize for advanced configurations. The environment variable approach means you can securely manage API keys and swap providers without touching code.
The TypeScript implementation ensures reliability, while the OpenAI SDK compatibility means you're working with battle-tested interfaces that most providers already support.
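The "OpenAI-compatible" claim boils down to every provider accepting the same wire format. The sketch below shows the shape of the request such a relay ultimately sends; `buildChatRequest` is an assumed helper for illustration, not the package's actual internals, though the endpoint path, headers, and body fields follow the standard Chat Completions API.

```typescript
// Shape of an OpenAI-compatible Chat Completions request, built from the
// same environment variables the server config above uses.
interface ChatEnv {
  AI_CHAT_KEY: string;      // provider API key
  AI_CHAT_MODEL: string;    // e.g. "sonar" or "llama3-70b-8192"
  AI_CHAT_BASE_URL: string; // e.g. "https://api.groq.com/openai/v1"
}

// Assumed helper: returns everything needed for a fetch() to the provider.
function buildChatRequest(env: ChatEnv, userContent: string) {
  return {
    url: `${env.AI_CHAT_BASE_URL}/chat/completions`,
    headers: {
      Authorization: `Bearer ${env.AI_CHAT_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: env.AI_CHAT_MODEL,
      messages: [{ role: "user", content: userContent }],
    }),
  };
}
```

Because only the base URL, key, and model name differ between providers, swapping Groq for Perplexity (or any other compatible API) is purely a configuration change.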
This isn't just another MCP server—it's a workflow multiplier that removes artificial boundaries between AI models. Your conversations can now span the capabilities of the entire AI ecosystem, all orchestrated through Claude's interface you already know.