# AxonHub AI Gateway provider plugin for OpenClaw

Route requests to 100+ LLM providers through a unified API gateway.

## Features
- Dynamic model discovery from your AxonHub instance
- Custom base URL for self-hosted AxonHub deployments
- OpenAI-compatible API transport (`openai-completions`)
- `xhigh` / `max` reasoning support for capable model families (OpenAI gpt-5.x / o3 / o4-mini, Anthropic Claude opus-4.7 / sonnet-4.7, DeepSeek V4, Google Gemini 3.x); other reasoning models fall back to OpenClaw's built-in graceful effort downgrade, so `xhigh` requests don't hit upstream 400s
- Forward-compat: if AxonHub later exposes per-model reasoning effort lists in `capabilities.reasoning_efforts` (or alias keys), the plugin reads them preferentially over the built-in family table
- Automatic model metadata: context window, pricing, capabilities
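For illustration, a per-model capability payload with such a list might look like the sketch below. This is a hypothetical shape: the field names other than `capabilities.reasoning_efforts` (e.g. `id`, `context_window`) are assumptions, since AxonHub has not shipped this response format yet.

```json
{
  "id": "gpt-5.x",
  "context_window": 200000,
  "capabilities": {
    "reasoning_efforts": ["low", "medium", "high", "xhigh"]
  }
}
```

When a list like this is present, the plugin would trust it over the built-in family table; when it is absent, the family table and graceful downgrade behavior apply as described above.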
## Installation

```
openclaw plugins install @ruelya/axonhub-openclaw-plugin
```

Or explicitly from ClawHub:

```
openclaw plugins install clawhub:@ruelya/axonhub-openclaw-plugin
```

Restart the gateway after installing.
## Configuration

Run onboarding to configure AxonHub:

```
openclaw onboard --axonhub-api-key <your-api-key>
```

| Setting | Description |
|---|---|
| API Key | Your AxonHub API key |
| Base URL | AxonHub instance URL (default: http://localhost:8090) |
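
For a self-hosted deployment, the stored provider settings would conceptually look like the fragment below. The key names (`apiKey`, `baseUrl`) are assumptions for illustration; the actual keys are whatever OpenClaw's onboarding writes to its config.

```json
{
  "axonhub": {
    "apiKey": "<your-api-key>",
    "baseUrl": "https://axonhub.internal.example.com"
  }
}
```

If no base URL is set, the plugin falls back to the default `http://localhost:8090`.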
## License

MIT