Node.js MCP server that wraps the `w3` CLI to let language models and other MCP clients manage data and spaces on storacha.network (IPFS).
https://github.com/alexbakers/mcp-ipfs

Stop wrestling with traditional cloud storage APIs when your AI applications need persistent, decentralized data management. This MCP server gives your language models and AI agents direct access to IPFS through storacha.network, wrapping the powerful w3 CLI into clean MCP tools.
Your AI applications generate and consume massive amounts of data—context files, training data, model outputs, user sessions. Traditional storage solutions lock you into vendor ecosystems and geographic regions. This MCP server changes that by letting your AI agents manage decentralized storage directly through IPFS.
No more context switching between your AI workflow and storage management. Your language models can now upload files, manage storage spaces, create sharing links, and handle delegations—all through familiar MCP tool calls.
- Direct AI-to-IPFS Integration: Your Claude, GPT, or custom language models can upload files, create spaces, and manage IPFS content without you writing storage abstraction layers.
- Persistent AI Context: Store conversation histories, RAG databases, and AI-generated content on IPFS with permanent, content-addressed URLs that survive server migrations.
- Decentralized AI Workflows: Build AI applications that aren't tied to specific cloud providers. Your data lives on IPFS, and your AI can access it from anywhere.
- Automated Storage Management: Let your AI agents handle their own storage needs: creating spaces when needed, cleaning up old data, and managing access permissions through delegations.
- AI Documentation Systems: Your AI generates comprehensive documentation, uploads it to IPFS via w3_up, and shares permanent w3s.link URLs that never break, even if your infrastructure changes.
- RAG Pipeline Storage: Store your vector embeddings and source documents on IPFS. Your AI can manage the entire pipeline: uploading new documents with w3_up, listing existing content with w3_ls, and cleaning up outdated data with w3_rm.
- Multi-Agent Collaboration: Different AI agents can share data through IPFS spaces. One agent uploads research data, another processes it, and a third generates reports, all coordinated through MCP tool calls to the same decentralized storage.
- AI Training Data Management: Your AI can organize and version training datasets on IPFS, create delegations for team access, and generate usage reports for billing, all programmatically.
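Those permanent URLs fall out of content addressing: once an upload returns a CID, the public gateway URL can be derived from it directly. A minimal sketch (the `/ipfs/<cid>` path follows the standard IPFS gateway convention used by w3s.link; the CID below is a placeholder, and the validation is only a rough shape check, not real CID parsing):

```typescript
// Derive a permanent, content-addressed gateway URL from a CID.
// Because the URL is keyed by content hash, it stays valid no matter
// where the data is actually hosted.
function gatewayUrl(cid: string): string {
  // Rough sanity check only; CIDs are base-encoded alphanumeric strings.
  if (!/^[A-Za-z0-9]+$/.test(cid)) {
    throw new Error(`invalid CID: ${cid}`);
  }
  return `https://w3s.link/ipfs/${cid}`;
}

// Placeholder CID for illustration:
const url = gatewayUrl("bafybeiexamplecid");
console.log(url);
```

An agent that stores conversation history this way can hand the resulting URL to any other agent or service without sharing credentials, since the link is read-only and globally resolvable.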
Add this to your MCP client configuration and you're up and running:
```json
{
  "mcpServers": {
    "ipfs": {
      "command": "npx",
      "args": ["-y", "mcp-ipfs"],
      "env": {
        "W3_LOGIN_EMAIL": "[email protected]"
      }
    }
  }
}
```
Your AI immediately gains access to 20+ IPFS operations—from basic file management to advanced delegation handling. The server wraps the battle-tested w3 CLI, so you get the full power of storacha.network without learning new APIs.
This isn't just file upload and download. Your AI can authenticate, create and switch between storage spaces, generate sharing links, and issue delegations for fine-grained access control. The comprehensive tool set covers everything from authentication (w3_login) to advanced storage operations (the w3_can_* tools), giving your AI applications enterprise-grade decentralized storage capabilities.
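Since the server wraps the w3 CLI, the core idea is a mapping from MCP tool names to w3 subcommand invocations. A sketch of that mapping for a few of the tools named above (the tool names come from this document; the exact parameter names like `path`, `cid`, and `email` are assumptions for illustration):

```typescript
// Map an MCP tool call onto the argv for the corresponding w3 CLI
// subcommand. Only a handful of tools are shown; the real server
// exposes 20+ operations.
function toW3Argv(tool: string, args: Record<string, string> = {}): string[] {
  switch (tool) {
    case "w3_login":
      return ["login", args.email];
    case "w3_up":
      return ["up", args.path];
    case "w3_ls":
      return ["ls"];
    case "w3_rm":
      return ["rm", args.cid];
    default:
      throw new Error(`unknown tool: ${tool}`);
  }
}
```

Keeping the mapping in one place like this is what lets the server stay a thin, predictable wrapper: the CLI remains the source of truth for behavior, and the MCP layer only handles naming, validation, and transport.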
Built with proper TypeScript foundations, Zod schema validation, and comprehensive error handling. Available as NPM package or Docker container. The modular architecture in src/ separates tool handlers, schemas, and utilities for easy customization and extension.
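The handler pattern described above, validate the incoming arguments, shell out to the CLI, return its output, might look roughly like this. The real server uses Zod for validation; to keep this sketch dependency-free, the check is hand-rolled, and the `w3_up` argument shape (`path`) is an assumption:

```typescript
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(execFile);

// Hypothetical argument shape for the w3_up tool.
interface W3UpArgs {
  path: string;
}

// Stand-in for a Zod schema: reject anything that isn't
// { path: non-empty string }.
function parseW3UpArgs(input: unknown): W3UpArgs {
  if (typeof input !== "object" || input === null) {
    throw new Error("w3_up: arguments must be an object");
  }
  const { path } = input as Record<string, unknown>;
  if (typeof path !== "string" || path.length === 0) {
    throw new Error("w3_up: 'path' must be a non-empty string");
  }
  return { path };
}

// Tool handler: validate, invoke the w3 CLI, return its stdout as the
// tool result. Requires the w3 CLI to be installed and logged in.
async function handleW3Up(input: unknown): Promise<string> {
  const { path } = parseW3UpArgs(input);
  const { stdout } = await run("w3", ["up", path]);
  return stdout.trim();
}
```

Separating the parse step from the handler, as the src/ layout's split between schemas and tool handlers suggests, means validation failures surface as clean MCP errors before any subprocess is spawned.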
When your AI applications need storage that travels with your data rather than locking you to a provider, this MCP server delivers the IPFS integration you've been looking for.