MCP server that plugs Macrocosmos Subnet (SN13/SN1) data and Hugging Face search into Claude Desktop or Cursor.
https://github.com/macrocosm-os/macrocosmos-mcp

Stop switching between tabs to research trends, check social sentiment, or hunt for the right Hugging Face models. The Macrocosmos MCP server brings live social data from X and Reddit, plus comprehensive ML model discovery, directly into Claude Desktop and Cursor.
You're deep in a project when you need to understand user sentiment about a feature, find similar implementations discussed on social platforms, or locate a specific model for your use case. Instead of breaking flow to browse X, Reddit, and Hugging Face separately, you can query everything from your AI environment.
Real scenario: Building a sentiment analysis feature? Ask Claude to pull recent Twitter discussions about your topic, analyze the sentiment patterns, then recommend relevant models from Hugging Face - all without leaving your editor.
The server delivers two capabilities: live social intelligence (real-time X and Reddit data) and ML model discovery (Hugging Face search). Some example queries:
Product Research: "Show me recent Twitter discussions about GraphQL performance issues and suggest models that could help with query optimization."
Competitive Intelligence: "What are developers saying on Reddit about React server components, and what models exist for code analysis in this space?"
Technical Decision Making: "Find social discussions about vector database choices and recommend embedding models from Hugging Face that work well with each option."
Content Strategy: "Pull trending AI topics from social platforms and suggest datasets I could use to build relevant examples."
The setup respects your existing workflow: no complex infrastructure changes needed.
git clone https://github.com/macrocosm-os/macrocosmos-mcp.git
cd macrocosmos-mcp/src
uv venv && source .venv/bin/activate
uv add "mcp[cli]" httpx macrocosmos
Add the server to your Claude or Cursor config, and you're pulling social insights and ML resources without breaking focus.
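For Claude Desktop, the entry in `claude_desktop_config.json` typically follows the standard MCP server shape sketched below. The server key (`macrocosmos`), the entry-point filename (`server.py`), the placeholder path, and the `MACROCOSMOS_API_KEY` variable name are assumptions — adjust them to match your local checkout and the repository's README:

```json
{
  "mcpServers": {
    "macrocosmos": {
      "command": "uv",
      "args": [
        "--directory",
        "/ABSOLUTE/PATH/TO/macrocosmos-mcp/src",
        "run",
        "server.py"
      ],
      "env": {
        "MACROCOSMOS_API_KEY": "your-api-key"
      }
    }
  }
}
```

Cursor uses the same `mcpServers` JSON shape in its own MCP settings file, so the block above can be reused there with the same caveats.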
Social platforms contain unfiltered developer opinions, real user feedback, and emerging trends that influence technical decisions; Hugging Face hosts the models you'll actually use. Having both accessible through natural-language queries in your AI environment eliminates the research friction that slows down informed development.
The Macrocosmos MCP server turns your AI assistant into a research powerhouse that understands both the social context around your work and the technical resources available to implement it.