
Connect your WHMCS to the AI tools you already use. MX Modules works with Claude, Cursor, ChatGPT, and local AI models through the MCP protocol.
Each MX module connects to WHMCS. MCP Server extends that connection to AI clients.
**WHMCS** (Platform)
The foundation. All MX modules install directly into your WHMCS admin panel. Supports WHMCS 8.0 and newer.

**Claude Desktop** (AI Client)
Built by Anthropic, the creators of the MCP protocol, with native MCP support. The recommended client for MCP Server.

**Cursor** (AI Client)
AI-powered code editor with native MCP support. Use it to query WHMCS data while developing modules or customizations.

**ChatGPT** (AI Client)
No native MCP support. Can connect to WHMCS via third-party wrappers or the WHMCS API directly.

**Claude Code** (AI Client)
Anthropic's CLI agent for developers. Native MCP support. Query WHMCS from your terminal while coding.

**Codex Desktop** (AI Client)
OpenAI's desktop agent with MCP support. Configure via TOML to connect to your WHMCS server.

**Windsurf** (AI Client)
AI-powered IDE by Codeium with native MCP support. Configure like Cursor to query WHMCS while developing.

**Continue** (AI Client)
Open-source IDE extension for VS Code and JetBrains with native MCP support.

**Local Models (Ollama)** (AI Client)
Run AI models on your own hardware for maximum privacy. Connect to WHMCS via MCP clients that support local models.
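The Codex Desktop card above mentions TOML configuration. As a rough sketch, an entry in Codex's `config.toml` could look like the following; the server name, command, and environment variables here are hypothetical placeholders, not the module's documented configuration:

```toml
# Hypothetical entry in ~/.codex/config.toml
# Assumes MX MCP Server is reachable via a launchable command (illustrative only)
[mcp_servers.whmcs]
command = "npx"
args = ["-y", "whmcs-mcp-server"]
env = { WHMCS_URL = "https://your-whmcs.example.com", WHMCS_API_TOKEN = "YOUR_API_TOKEN" }
```

Consult the MCP setup guide for the exact keys and values the module expects.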
MCP (Model Context Protocol) is the open standard that makes AI + WHMCS work.
1. Upload the module to your WHMCS, activate it from Addons, and generate an API token. Takes 5 minutes.
2. Add one JSON block to Claude Desktop, Cursor, or any MCP-compatible client. Point it to your WHMCS server.
3. Ask your AI about clients, invoices, tickets, and services. Get real answers from your WHMCS data.
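The "one JSON block" for an MCP-compatible client might look like this sketch; the server name, URL, and token placeholder are illustrative, and the exact keys depend on the client and on how MX MCP Server is exposed (stdio vs. HTTP), so follow the MCP setup guide for the real block:

```json
{
  "mcpServers": {
    "whmcs": {
      "url": "https://your-whmcs.example.com/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_TOKEN"
      }
    }
  }
}
```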
For step-by-step instructions, see the MCP setup guide or learn what the MCP protocol is.
Which MX modules connect to which tools.
| Integration | MCP Server | MX Metrics | Proposals |
|---|---|---|---|
| WHMCS | ✓ | ✓ | ✓ |
| Claude Desktop | ✓ | — | — |
| Claude Code | ✓ | — | — |
| Cursor | ✓ | — | — |
| Codex Desktop | ✓ | — | — |
| Windsurf | ✓ | — | — |
| Continue | ✓ | — | — |
| ChatGPT (via wrappers) | Indirect | — | — |
| Local Models (Ollama) | ✓ | — | — |
MX Metrics and Proposals are self-contained WHMCS modules. They do not require AI integration. MCP Server is the bridge between WHMCS and AI tools.
Learn more about this topic from our blog

We tested OpenRouter, LiteLLM, Portkey, Bifrost, Kong, and Cloudflare AI Gateway with WHMCS. Which one supports MCP, self-hosting, and keeps your billing data private?

Use MX Proposals webhooks to trigger Slack notifications, n8n workflows, and CRM updates when clients view, sign, or pay proposals in WHMCS.

LiteLLM + MCP Server for WHMCS: self-host your AI, route 100+ models, and keep billing data on your servers. Native MCP Gateway support.
MCP Server gives your AI assistant direct access to clients, invoices, tickets, and services. 45 tools, one API token, 5-minute setup.