MCP server that exposes OpenAI’s web-search capability so assistants (Claude, Zed, etc.) can fetch fresh web results during a conversation.
https://github.com/ConechoAI/openai-websearch-mcp

Your AI assistant knows a lot, but it doesn't know what happened yesterday. This MCP server changes that by connecting Claude, Zed, and other AI tools directly to OpenAI's web search API, letting them pull fresh information during conversations.
You know the drill: you're deep in a coding session with Claude, need current information about a library update or recent news, and suddenly you're alt-tabbing to Google. This server eliminates that friction entirely.
Instead of breaking your flow to search separately, your assistant can now run the search itself and fold current results straight into its answer.
The setup is refreshingly simple:
OPENAI_API_KEY=sk-xxxx uv run --with uv --with openai-websearch-mcp openai-websearch-mcp-install
That's it. The command automatically configures your Claude settings, and you're ready to go. No manual JSON editing, no complex configuration files.
The server gives you granular control over how much context window space to dedicate to search results. You can also specify location context for geographically relevant results, which is useful if you're building location-aware applications.
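As a rough sketch of the knobs involved, here is how the underlying OpenAI Responses API expresses both settings in Python; the exact parameter names the MCP tool surfaces may differ, and the model name is only an example:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.responses.create(
    model="gpt-4o",  # example model; use one that supports web search
    tools=[{
        "type": "web_search_preview",
        # how much context window to spend on search results
        "search_context_size": "medium",  # "low" | "medium" | "high"
        # optional location hint for geographically relevant results
        "user_location": {
            "type": "approximate",
            "country": "GB",
            "city": "London",
        },
    }],
    input="What changed in the latest FastAPI release?",
)

print(response.output_text)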
Claude and Zed are supported today, and because the server speaks standard MCP it works with any MCP-compatible assistant, so as the ecosystem grows, your web search capability moves with you.
Debugging Production Issues: When your monitoring shows errors from a third-party API, your assistant can immediately search for known issues, recent outages, or breaking changes without you leaving your terminal.
Library Research: Evaluating new dependencies? Your assistant can compare recent reviews, check GitHub issues, and find performance benchmarks in real-time.
Security Updates: Need to know if that vulnerability affects your stack? Your assistant can search for the latest security advisories and patch information.
API Integration: Working with a new service? Your assistant can pull the most current documentation, examples, and community discussions.
The server itself is built with simplicity and reliability in mind.
Install and configure in under a minute:
# One-command setup for Claude
OPENAI_API_KEY=your-key-here uv run --with uv --with openai-websearch-mcp openai-websearch-mcp-install
# Or manual pip installation
pip install openai-websearch-mcp
Add to your MCP configuration:
"mcpServers": {
"openai-websearch-mcp": {
"command": "uvx",
"args": ["openai-websearch-mcp"],
"env": {
"OPENAI_API_KEY": "your-api-key-here"
}
}
}
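If you want to poke at the server outside of Claude or Zed, a minimal client sketch using the official MCP Python SDK looks like the following. It assumes the stdio transport and the same uvx command as the config above; the tool names it prints are whatever the server actually registers.

import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way the Claude config does
params = StdioServerParameters(
    command="uvx",
    args=["openai-websearch-mcp"],
    env={"OPENAI_API_KEY": "your-api-key-here"},
)

async def main():
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List the tools the server exposes (e.g. its web search tool)
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

asyncio.run(main())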
Your assistant now has access to current web results through OpenAI's search API, without breaking your development flow.
The next time you need current information while coding, you won't need to context switch. Your AI assistant will handle the search and bring back exactly what you need, when you need it.