Dockerized MCP server that converts any Swagger/OpenAPI spec into a ready-to-use Model Context Protocol (MCP) toolset, enabling AI agents to call the target API.
https://github.com/ckanthony/openapi-mcp

Stop writing custom MCP tools for every API your AI agents need to access. OpenAPI-MCP reads any Swagger/OpenAPI specification and automatically generates a complete MCP toolset - no coding required.
You want your AI agent to call external APIs, but creating MCP tools manually for each endpoint is tedious. You're looking at hours of boilerplate code for parameter validation, request formatting, and response handling. Meanwhile, that API already has a perfectly good OpenAPI spec that describes everything your tooling needs to know.
OpenAPI-MCP solves this by doing the heavy lifting for you. Point it at any OpenAPI specification, and you get a fully functional MCP server that your AI agents can use immediately.
Instant API Access: Got an OpenAPI spec? You're 30 seconds away from AI integration. No custom tool development, no parameter mapping, no request validation code.
Security Built-In: API keys stay in the server environment where they belong. Your AI agent never sees the credentials, but can still make authenticated requests through the proxy.
Works with Existing Documentation: Every modern API publishes OpenAPI specs. This tool makes thousands of APIs instantly available to your agents without additional work.
Weather and Location Services: Connect to Weatherbit, OpenWeatherMap, or any weather API. Your agent can check conditions, forecasts, and alerts by simply referencing the location.
E-commerce Integration: Shopify, WooCommerce, and other platforms with OpenAPI specs become immediately accessible for inventory checks, order management, and customer data.
Internal Company APIs: If your team documents APIs with OpenAPI (which you should), those APIs can now be consumed by AI agents without custom integration work.
Third-Party Services: Payment processors, CRM systems, marketing tools - if they have OpenAPI documentation, they're compatible.
The fastest path is using the pre-built Docker image:
# For a public API like Petstore
docker run -p 8080:8080 --rm \
  ckanthony/openapi-mcp:latest \
  --spec https://petstore.swagger.io/v2/swagger.json
# For APIs requiring authentication
docker run -p 8080:8080 --rm \
  -e API_KEY="your_key_here" \
  ckanthony/openapi-mcp:latest \
  --spec /path/to/spec.json \
  --api-key-env API_KEY \
  --api-key-name Authorization \
  --api-key-loc header
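Note that the container cannot read files on the host directly, so a local spec file has to be mounted as a volume before `--spec` can point at it. A minimal sketch - the `/specs` mount point and `openapi.json` file name are illustrative assumptions, not values from the project docs:

```shell
# Mount a local directory containing the spec into the container (read-only),
# then reference the spec by its in-container path.
# The /specs mount point and openapi.json file name are placeholders.
docker run -p 8080:8080 --rm \
  -v "$(pwd)/specs:/specs:ro" \
  -e API_KEY="your_key_here" \
  ckanthony/openapi-mcp:latest \
  --spec /specs/openapi.json \
  --api-key-env API_KEY \
  --api-key-name Authorization \
  --api-key-loc header
```

The `:ro` suffix mounts the directory read-only, which is good hygiene since the server only needs to read the spec.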
The server immediately starts generating MCP tools from your spec. Your AI agent can now call any endpoint defined in that OpenAPI document.
Don't want to expose every endpoint? Use the filtering options:
# Only include specific operations
--include-op getUserById,updateUser
# Exclude sensitive endpoints
--exclude-tag admin,internal
# Include only certain API sections
--include-tag weather,location
This gives you fine-grained control over which API operations your agents can access.
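The filter flags slot into the same run command as any other option. A sketch using the Petstore spec from earlier - the assumption here is that Petstore tags its operations `pet`, `store`, and `user`, so restricting to `pet` hides the rest:

```shell
# Expose only the operations tagged "pet" from the Petstore spec;
# store and user endpoints are not turned into MCP tools.
docker run -p 8080:8080 --rm \
  ckanthony/openapi-mcp:latest \
  --spec https://petstore.swagger.io/v2/swagger.json \
  --include-tag pet
```

Tag-based filtering is usually the coarser, safer starting point; switch to `--include-op` when you need to whitelist individual operations by ID.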
The tool is built for modern deployment workflows. Mount your specs as volumes, pass API keys through environment variables, and scale horizontally if needed. The container includes everything required - no external dependencies or complex setup.
OpenAPI-MCP handles the complex scenarios too - parameter validation, request formatting, and authenticated requests are all generated automatically based on what's defined in your OpenAPI specification.
Once running, your MCP-compatible clients (like Cursor) can discover and use the generated tools immediately. The server exposes standard MCP endpoints, so integration is seamless with existing agent frameworks.
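For Cursor specifically, registering the server comes down to one entry in its MCP configuration file. A hedged sketch - the `~/.cursor/mcp.json` location follows Cursor's documented convention, but the `/sse` endpoint path is an assumption; check the project README for the exact URL the server exposes:

```shell
# Hypothetical Cursor registration; the /sse endpoint path is an assumption.
mkdir -p ~/.cursor
cat > ~/.cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "openapi-mcp": {
      "url": "http://localhost:8080/sse"
    }
  }
}
EOF
```

After restarting Cursor, the generated tools should appear under the `openapi-mcp` server name.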
The productivity gain is immediate - instead of spending hours building custom API integrations, you're connecting to external services in minutes. Your agents get reliable, well-documented API access, and you get back to building features that matter.