Bug type
Regression (worked before, now fails)
Summary
Bug Description
The OpenClaw Control UI incorrectly handles provider prefixes when attempting to switch to local models (e.g., Ollama) via the model dropdown, resulting in a GatewayRequestError: model not allowed exception.
Steps to reproduce
Steps to Reproduce
- Configure openclaw.json with an Ollama auth profile and register a local model in agents.defaults.models (e.g., ollama/qwen2.5:32b or ollama/huihui_ai/qwen2.5-abliterate:7b-instruct).
- Open the OpenClaw Control UI.
- Attempt to switch the active model using the UI dropdown to one of the local Ollama models.
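For reference, a minimal openclaw.json sketch matching the setup above. Only agents.defaults.models is taken from this report; the auth-profile key names and baseUrl field are illustrative assumptions about the config schema, not verified against OpenClaw's documentation:

```json
{
  "auth": {
    "ollama": { "baseUrl": "http://127.0.0.1:11434" }
  },
  "agents": {
    "defaults": {
      "models": [
        "ollama/qwen2.5:32b",
        "ollama/huihui_ai/qwen2.5-abliterate:7b-instruct"
      ]
    }
  }
}
```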
Expected behavior
Expected Behavior
The UI should send the exact model string (including the ollama/ prefix) to the Gateway, and the model switch should succeed.
Actual behavior
Actual Behavior
The UI modifies or strips the provider prefix. This results in the Gateway routing the request to the default provider (Google) or failing the allowlist check.
Error examples observed in the UI:
- Failed to set model: GatewayRequestError: model not allowed: google/qwen2.5:32b (UI replaced ollama/ with google/)
- Failed to set model: GatewayRequestError: model not allowed: huihui_ai/qwen2.5-abliterate:7b-instruct (UI stripped the ollama/ prefix)
OpenClaw version
v2026.3.13
Operating system
Windows 11
Install method
npm global
Model
ollama/huihui_ai/qwen2.5-abliterate:7b-instruct
Provider / routing chain
openclaw -> local ollama (http://127.0.0.1:11434)
Additional provider/model setup details
Additional Context
Bypassing the UI and setting the model directly via the backend (session_status tool) works perfectly, confirming that the gateway routing and local configuration are valid. The bug is isolated to how the Control UI frontend constructs the model change request.
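The two error messages above are consistent with the frontend splitting the model string on "/" before sending it. A minimal sketch of that suspected logic; the helper names and the "google" default are assumptions for illustration, not OpenClaw's actual code:

```typescript
// Assumption: the UI falls back to this provider when re-prefixing.
const ASSUMED_DEFAULT_PROVIDER = "google";

// Failure mode 2 (second error): dropping everything before the first "/"
// discards the provider segment of multi-slash model IDs.
function stripProvider(modelId: string): string {
  return modelId.split("/").slice(1).join("/");
}

// Failure mode 1 (first error): stripping, then re-prefixing with the
// default provider.
function reprefixWithDefault(modelId: string): string {
  return `${ASSUMED_DEFAULT_PROVIDER}/${stripProvider(modelId)}`;
}

// Expected behavior: forward the model string verbatim and let the Gateway
// resolve the provider.
function forwardVerbatim(modelId: string): string {
  return modelId;
}

console.log(reprefixWithDefault("ollama/qwen2.5:32b"));
// → "google/qwen2.5:32b" (first rejected model above)
console.log(stripProvider("ollama/huihui_ai/qwen2.5-abliterate:7b-instruct"));
// → "huihui_ai/qwen2.5-abliterate:7b-instruct" (second rejected model above)
```

Since both rejected strings fall out of this one split-and-rejoin pattern, the likely fix is for the UI to send the selected dropdown value unmodified.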
Logs, screenshots, and evidence
No response
Impact and severity
No response
Additional information
No response