Convert Any OpenAPI V3 API to MCP Server
https://github.com/automation-ai-labs/mcp-link

Every time you want to connect an API to your AI agent, you write another MCP server wrapper. GitHub integration? Custom server. Slack functionality? Another server. Notion access? Yet another server.
You know there's a better way.
MCP Link converts any OpenAPI v3 specification into a fully functional MCP server - automatically mapping every endpoint, parameter, and authentication method without touching your original API code.
You're building the same boilerplate over and over: tool definitions, parameter schemas, authentication handling, error mapping - the same glue code for every single API.
Meanwhile, thousands of well-documented APIs already exist with OpenAPI specs - from GitHub and Slack to your internal microservices.
Point it at any OpenAPI spec and get a production-ready MCP server:
# Convert GitHub's API in seconds
go run main.go serve --port 8080 &
# Now GitHub's entire API is available to your AI agent
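With the server running locally, your agent config points at its SSE endpoint and passes the spec location, target API, and auth header as query parameters - the same s=, u=, and h= parameters used in the config further down. A minimal sketch; the GitHub spec URL and the bearer-token setup here are illustrative assumptions, not something the project dictates:

{
  "mcpServers": {
    "github": {
      "url": "http://localhost:8080/sse?s=https://raw.githubusercontent.com/github/rest-api-description/main/descriptions/api.github.com/api.github.com.json&u=https://api.github.com&h=Authorization:Bearer"
    }
  }
}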
Or use the hosted version instantly:
{
  "mcpServers": {
    "github": {
      "url": "https://mcp-link.vercel.app/links/github"
    }
  }
}
Your AI agent now has complete GitHub access - repositories, issues, pull requests, everything - with zero custom code.
Instead of 2-3 days per API integration, you get a working MCP server in minutes - and you keep control over what's exposed, since endpoints can be filtered with path expressions such as +/users/*:GET;-/internal/** that include or exclude specific routes and methods (a config sketch follows the use cases below).

Internal Tool Integration: Your company has 12 microservices with OpenAPI specs. Instead of writing 12 MCP servers, run one MCP Link instance that converts them all.
Multi-API Workflows: Build an AI agent that creates GitHub issues from Slack messages, updates Notion databases, and sends status updates - using pre-built API integrations instead of custom wrappers.
Rapid Prototyping: Test AI agent interactions with any public API immediately. Want to experiment with Stripe's API? Point MCP Link at their OpenAPI spec and start building.
Legacy API Modernization: Your older REST APIs get instant AI agent compatibility without modifying existing code - just generate or update their OpenAPI specs.
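Here is the config sketch promised above for the internal-tools case. It assumes the path filter is passed as an extra f= query parameter alongside s=, u=, and h= (the parameter name, spec path, and service URL are all assumptions to adapt to your setup, and the filter value may need URL-encoding):

{
  "mcpServers": {
    "billing-service": {
      "url": "http://localhost:8080/sse?s=./specs/billing.yaml&u=https://billing.internal.com&h=Authorization:Bearer&f=+/users/*:GET;-/internal/**"
    }
  }
}

One instance, one URL per service, and only the allowed routes ever become tools.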
MCP Link runs as a standard HTTP server that speaks MCP protocol. Your existing AI agent setup works unchanged:
{
  "mcpServers": {
    "internal-apis": {
      "url": "http://localhost:8080/sse?s=./internal-api.yaml&u=https://api.internal.com&h=Authorization:Bearer"
    }
  }
}
The server handles all the protocol translation, authentication forwarding, and error handling. Your agent sees clean MCP tool calls; your API sees normal REST requests.
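To make the translation concrete, here is roughly what a round trip looks like. The tool name is hypothetical (actual names depend on how MCP Link derives them from the spec's operations), but the tools/call envelope is the standard MCP JSON-RPC shape:

Agent -> MCP Link (MCP tool call):
{"jsonrpc": "2.0", "id": 1, "method": "tools/call", "params": {"name": "list_user_repos", "arguments": {"per_page": 5}}}

MCP Link -> API (ordinary REST request):
GET https://api.github.com/user/repos?per_page=5
Authorization: Bearer <token>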
OpenAPI specs already contain everything needed for AI integration: endpoint paths and methods, parameter names and types, request and response schemas, authentication schemes, and human-readable descriptions of what each operation does.
MCP Link reads these specs and generates the MCP interface automatically, ensuring nothing gets missed and everything stays in sync when APIs evolve.
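For instance, a fragment as small as this already gives an agent the operation name, parameter types, and a description to reason about (exactly how MCP Link names the resulting tool is an assumption - the operationId is the natural candidate):

paths:
  /users/{id}:
    get:
      operationId: getUser
      summary: Fetch a single user by ID
      parameters:
        - name: id
          in: path
          required: true
          schema:
            type: integer
      responses:
        "200":
          description: The requested user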
Ready to stop building the same integration code repeatedly? Clone the repo and point it at your first OpenAPI spec - you'll have a working MCP server before your coffee gets cold.