Turn any existing HTTP service into an MCP (Model Context Protocol) server through a lightweight reverse-proxy that converts HTTP ↔ MCP (SSE or Streamable HTTP) on the fly.
https://github.com/sxhxliang/mcp-access-point

Stop rewriting your perfectly good HTTP services just to work with MCP clients like Cursor, VS Code, or Windsurf. This lightweight reverse proxy converts any existing HTTP service into a fully compliant MCP server without touching a single line of your application code.
You've got HTTP APIs that work great. Now you want to use them with MCP-powered AI tools, but you're stuck between two bad options: rebuild everything for MCP or miss out on the productivity gains.
Neither makes sense when you just need a translation layer.
This Rust-based proxy sits between your HTTP services and MCP clients, handling all the protocol conversion on the fly. Point it at your existing APIs, configure the routes, and you're done - your services now speak MCP.
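Conceptually, the conversion is a mapping from MCP tool calls to ordinary HTTP requests against your upstream. A minimal Python sketch of that idea (the function name, tool-name convention, and dict structure are illustrative assumptions, not the project's actual Rust internals):

```python
def mcp_call_to_http(tool_call: dict, upstream_base: str) -> dict:
    """Translate an MCP tools/call request into an HTTP request description.

    Illustrative only: the real proxy derives the method and path from the
    service's OpenAPI spec; here we assume the tool name encodes them.
    """
    # Assumed convention: tool name is "<http-method> <path>", e.g. "get /weather"
    method, _, path = tool_call["name"].partition(" ")
    return {
        "method": method.upper(),
        "url": upstream_base.rstrip("/") + path,
        "params": tool_call.get("arguments", {}),
    }

req = mcp_call_to_http(
    {"name": "get /weather", "arguments": {"city": "Oslo"}},
    "https://api.weather.com",
)
print(req["method"], req["url"])  # → GET https://api.weather.com/weather
```

The point is that nothing about the upstream service changes: the MCP side is pure translation, which is why no application code needs to be touched.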
Built on Pingora - the same ultra-high performance proxy library that handles 40+ million requests per second for Cloudflare's core infrastructure. You're getting enterprise-grade performance in a lightweight package.
Zero Code Changes Required - Your existing HTTP services keep running exactly as they are. No SDK integration, no new endpoints, no disruption to your current architecture.
Multi-Service Management - Configure multiple HTTP services behind a single MCP endpoint. Each service gets its own ID and can be accessed individually or as part of a unified interface.
Production-Ready Performance - Pingora's battle-tested foundation means this proxy won't become your bottleneck, even under heavy load.
Flexible Protocol Support - Works with both SSE and Streamable HTTP transports, so you can integrate with any MCP client that supports either protocol.
Existing API Integration: You have a weather API at api.weather.com and a local service at localhost:8090. Configure both in a single YAML file, and suddenly Cursor can call either service through MCP without you writing any new code.
Microservices Bridge: Your organization has dozens of HTTP microservices. Instead of updating each one for MCP support, deploy this proxy once and give your AI tools access to your entire service ecosystem.
Legacy System Modernization: Got older HTTP APIs that need to work with modern AI development tools? This proxy brings them into the MCP ecosystem without requiring any legacy code changes.
The configuration is straightforward - define your services and their upstream targets:
```yaml
mcps:
  - id: weather-service
    upstream_id: 1
    path: https://api.weather.com/openapi.json
  - id: local-api
    upstream_id: 2
    path: config/local-api-spec.json

upstreams:
  - id: 1
    nodes:
      "api.weather.com": 1
    scheme: https
  - id: 2
    nodes:
      "127.0.0.1:8090": 1
```
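Each service's path points at an OpenAPI description, which the proxy uses to discover the operations it exposes as MCP tools. A minimal sketch of what a local spec such as the one referenced above could contain (the endpoint and operation names here are made up for illustration):

```json
{
  "openapi": "3.0.0",
  "info": { "title": "Local API", "version": "1.0.0" },
  "paths": {
    "/status": {
      "get": {
        "operationId": "get_status",
        "summary": "Return service health"
      }
    }
  }
}
```

Each operation in the spec becomes a callable tool on the MCP side, so the more precise your operationIds and summaries are, the more usable the tools will be to AI clients.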
Run it with `cargo run -- -c config.yaml` and your services are immediately available to MCP clients at:
- localhost:8080/sse (for SSE-based clients)
- localhost:8080/mcp (for Streamable HTTP clients)

Each service gets its own endpoint while sharing the same proxy instance:
- /api/weather-service/sse for just the weather API
- /api/local-api/mcp for just your local service
- /sse or /mcp for all services combined

This means you can give different teams access to different services while managing everything through one proxy.
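On the client side, you typically just register one of these URLs. A sketch for an SSE-capable client using the common mcpServers configuration convention (the file location and exact schema vary by client, and the server name below is illustrative):

```json
{
  "mcpServers": {
    "my-http-services": {
      "url": "http://localhost:8080/sse"
    }
  }
}
```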
```shell
docker run -d \
  -p 8080:8080 \
  -v /path/to/config.yaml:/app/config/config.yaml \
  ghcr.io/sxhxliang/mcp-access-point:main
```
Your HTTP services are now MCP-compatible and ready for Cursor, VS Code, Windsurf, or any other MCP client.
MCP is becoming the standard way AI development tools interact with external services. Instead of rebuilding your HTTP infrastructure, you can bridge it in minutes and immediately start using your existing APIs with AI-powered development tools.
The proxy approach means you keep your current architecture while gaining access to the growing MCP ecosystem - no technical debt, no rewrites, just immediate compatibility.