Simple MCP (Model Context Protocol) server that lets Claude Desktop and other MCP-capable clients create and chat with OpenAI Assistants.
https://github.com/andybrandt/mcp-simple-openai-assistant

Stop context-switching between AI tools. This MCP server gives your Claude Desktop direct access to OpenAI's entire assistant ecosystem—including custom GPTs—without leaving your workflow.
You're already using Claude for complex reasoning and analysis. But what happens when you need specialized capabilities that live in OpenAI's assistant ecosystem? Until now, you've been copying and pasting between interfaces, losing context and breaking your flow.
This MCP server eliminates that friction entirely. Claude can now create, configure, and chat with OpenAI assistants directly—treating them as specialized tools in its arsenal rather than separate applications.
OpenAI assistants can take minutes to process complex requests, but MCP clients like Claude Desktop have built-in timeouts that kill long-running operations. Most developers would just accept this limitation.
Instead, this server implements a clever two-stage approach: one call kicks off the assistant run and returns immediately, and a later call retrieves the response once OpenAI has finished processing.
This pattern handles OpenAI's processing delays while keeping Claude's interface responsive.
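To make the pattern concrete, here is a minimal sketch of the underlying two-stage flow against the OpenAI Assistants API in Python. The function names (start_run, fetch_result) are illustrative only, not the server's actual tool names, and the real implementation may differ in its details.

# Illustrative sketch of the two-stage pattern using the OpenAI Python SDK.
# Function names are hypothetical; the server's actual tools may differ.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def start_run(assistant_id: str, prompt: str) -> dict:
    # Stage 1: create a thread, post the prompt, and start a run.
    # Returns immediately with IDs, so the MCP call never hits a client timeout.
    thread = client.beta.threads.create()
    client.beta.threads.messages.create(
        thread_id=thread.id, role="user", content=prompt
    )
    run = client.beta.threads.runs.create(
        thread_id=thread.id, assistant_id=assistant_id
    )
    return {"thread_id": thread.id, "run_id": run.id}

def fetch_result(thread_id: str, run_id: str) -> str | None:
    # Stage 2: called later; returns the assistant's reply once the run
    # has completed, or None if OpenAI is still processing.
    run = client.beta.threads.runs.retrieve(thread_id=thread_id, run_id=run_id)
    if run.status != "completed":
        return None
    messages = client.beta.threads.messages.list(thread_id=thread_id)
    return messages.data[0].content[0].text.value  # newest message first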
Multi-Model Analysis: Use Claude for initial analysis, then delegate specialized tasks to domain-specific OpenAI assistants without losing context.
GPT Integration: Access your custom GPTs directly from Claude Desktop. That specialized coding assistant or research tool you built? Now it's part of Claude's toolkit.
Workflow Orchestration: Claude can manage conversations across multiple OpenAI assistants, comparing outputs and synthesizing results—essentially becoming your AI project manager.
Long-Running Tasks: Perfect for assistants that need time to process large datasets, generate detailed reports, or perform complex calculations. Start the work, continue with other tasks in Claude, then retrieve results when ready.
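As a rough illustration of that deferred-retrieval flow, here is how the hypothetical helpers sketched earlier could be used (the assistant ID is a placeholder):

import time

# Kick off a long task, keep working, and collect the answer later.
handle = start_run(assistant_id="asst_123", prompt="Analyze this quarterly dataset")
# ...Claude continues with other tool calls or reasoning here...
while (reply := fetch_result(**handle)) is None:
    time.sleep(10)  # poll occasionally instead of blocking a single MCP call
print(reply)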
Configuration is straightforward. Add your OpenAI API key to your MCP client's configuration (for Claude Desktop, that means claude_desktop_config.json) and the server handles the rest:
{
  "mcpServers": {
    "openai-assistant": {
      "command": "python",
      "args": ["-m", "mcp_simple_openai_assistant"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
The server provides three core tools: create/manage assistants, handle conversations, and check system health. Simple enough that you'll be running cross-platform AI workflows in minutes.
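For a sense of what exposing such tools over MCP looks like, here is a hedged sketch using the official MCP Python SDK's FastMCP helper. This is not the actual server code; the real tool names, parameters, and behavior may differ.

# Hypothetical sketch: two of the three tool categories (assistant management
# and health check) exposed as MCP tools via the official Python SDK.
from mcp.server.fastmcp import FastMCP
from openai import OpenAI

mcp = FastMCP("openai-assistant")
client = OpenAI()

@mcp.tool()
def create_assistant(name: str, instructions: str, model: str = "gpt-4o") -> str:
    """Create a new OpenAI assistant and return its ID."""
    assistant = client.beta.assistants.create(
        name=name, instructions=instructions, model=model
    )
    return assistant.id

@mcp.tool()
def health_check() -> str:
    """Verify that the OpenAI API is reachable with the configured key."""
    client.models.list()  # raises if the key is invalid or the API is unreachable
    return "ok"

if __name__ == "__main__":
    mcp.run()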
AI tool fragmentation is real. You have capabilities scattered across different platforms, each with its own interface and context limitations. This server takes a different approach: instead of replacing tools, it connects them.
Claude becomes your central interface while tapping into OpenAI's specialized ecosystem. You get Claude's reasoning capabilities plus access to every custom GPT and assistant you've built or discovered.
The two-stage timeout solution alone makes this worth implementing if you're doing any serious work with AI assistants. No more babysitting long-running tasks or losing progress to arbitrary timeouts.
Ready to connect your AI tools instead of juggling them? This MCP server turns Claude into a hub for your entire AI workflow.