A Mattermost integration that connects to Model Context Protocol (MCP) servers, leveraging a LangGraph-based AI agent.
https://github.com/jagan-shanmugam/mattermost-mcp-host

Stop switching between chat, terminals, and web interfaces. This MCP host transforms your team's Mattermost channels into intelligent workspaces where you can execute any MCP tool through natural conversation.
Your team's productivity gets fragmented across dozens of tools. Someone asks about a GitHub issue in chat, you context-switch to GitHub, search, copy-paste links back. Need to check server status? Jump to another terminal. Want to search documentation? Open another browser tab.
Meanwhile, your chat becomes a graveyard of "let me check that and get back to you" messages.
This integration brings a LangGraph-powered AI agent directly into your Mattermost channels. The agent doesn't just chat—it dynamically discovers and executes tools from any MCP server you connect.
The magic happens in real conversations:
Thread-Aware Intelligence: The agent maintains full context within Mattermost threads. Ask follow-up questions, refine searches, or chain multiple actions without repeating context.
Smart Tool Orchestration: Instead of manually running individual commands, describe what you need. The agent figures out which tools to call and in what sequence. It might search existing GitHub issues first, then create a new one if nothing relevant exists (see the sketch after this list).
Zero Context Switching: Everything happens in your existing chat workflow. No new interfaces to learn, no breaking conversation flow to run commands elsewhere.
Team Knowledge Amplification: Junior developers get the same access to complex tooling as seniors. The agent handles the syntax, parameters, and API calls—team members just describe their needs in plain English.
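To make the orchestration point concrete, here is a minimal, hypothetical sketch of the pattern a LangGraph ReAct-style agent uses to decide which tools to call and in what order. The two stand-in tools are placeholders for tools the host would discover from MCP servers; the tool names, model choice, and wiring are illustrative assumptions, not this project's actual code.

```python
# Minimal sketch of LLM-driven tool orchestration (illustrative, not the project's code).
# The two @tool functions stand in for tools discovered from MCP servers.
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

@tool
def search_issues(query: str) -> str:
    """Search existing GitHub issues for the given query (stand-in for an MCP tool)."""
    return "No open issues mention 'timeout'."

@tool
def create_issue(title: str, body: str) -> str:
    """Create a new GitHub issue (stand-in for an MCP tool)."""
    return f"Created issue: {title}"

# The agent reads the request, decides to search first, and only creates an issue if needed.
agent = create_react_agent(ChatOpenAI(model="gpt-4o"), [search_issues, create_issue])
result = agent.invoke(
    {"messages": [("user", "We keep hitting request timeouts. File an issue if none exists yet.")]}
)
print(result["messages"][-1].content)
```

In the actual integration, the tool list would come from whatever MCP servers you have configured, and the conversation history from the Mattermost thread rather than a single message.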
The demos in the repository show this in action.
Sometimes you want direct tool access. Use the # prefix for immediate control:
- `#github call search_issues {"query": "timeout", "state": "open"}` to run a specific tool directly
- `#servers` to list all connected MCP servers
- `#github tools` to see available GitHub operations

Dynamic Tool Discovery: Connect any MCP server through the JSON config. The agent automatically discovers available tools and makes them accessible through conversation.
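Under the hood, both the `#` commands and the agent's own tool calls reduce to the same MCP client operations: list the tools a connected server exposes, then call one by name with JSON arguments. The sketch below uses the official MCP Python SDK against a GitHub MCP server started over stdio; the server command and package name are assumptions for illustration, not this project's exact configuration.

```python
# Rough sketch of what an MCP host does for a connected server (illustrative assumptions).
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed: a GitHub MCP server launched as a stdio subprocess.
params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-github"],
)

async def main() -> None:
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discovery: roughly what "#github tools" surfaces in chat.
            listing = await session.list_tools()
            print([tool.name for tool in listing.tools])
            # Direct call: roughly what "#github call search_issues {...}" maps to.
            result = await session.call_tool(
                "search_issues", arguments={"query": "timeout", "state": "open"}
            )
            print(result.content)

asyncio.run(main())
```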
Multiple LLM Support: Works with Azure OpenAI, OpenAI, Anthropic Claude, or Google Gemini. Configure once, works everywhere.
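How the project actually selects a provider is a configuration detail, but the configure-once idea is easy to picture in LangChain terms: choose the model and provider in one place, and the agent code stays identical. The snippet below is only an illustration of that pattern; the model names and provider keys are assumptions, not this project's settings.

```python
# Illustrative only: swapping LLM providers without touching the agent code.
from langchain.chat_models import init_chat_model

# Any one of these can back the agent, given the matching API key and integration package.
model = init_chat_model("gpt-4o", model_provider="openai")
# model = init_chat_model("claude-3-5-sonnet-latest", model_provider="anthropic")
# model = init_chat_model("gemini-1.5-pro", model_provider="google_genai")
# model = init_chat_model("gpt-4o", model_provider="azure_openai")
```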
Existing Workflow Friendly: everything happens inside the Mattermost channels and threads your team already uses, with no new interfaces to learn.

A few places where this pays off:

Support Channels: Agents can search knowledge bases, check system status, and escalate to appropriate tools based on issue descriptions.
Code Review Discussions: "Check if there are similar PRs" or "Create a follow-up issue for the performance concern mentioned" happen inline with code discussions.
Incident Response: Combine monitoring tools, documentation searches, and issue tracking in a single conversational interface during critical incidents.
Onboarding: New team members get instant access to institutional knowledge and tooling through natural language, not learning dozens of CLI tools.
This isn't about replacing your tools—it's about making them accessible where your team already collaborates. Turn your Mattermost channels into intelligent workspaces that get things done.