Bug type
Behavior bug (incorrect output/state without crash)
Summary
When using the model picker dropdown in the Control UI to switch to an Ollama model (e.g. ollama/gpt-oss:120b-cloud), the UI submits the model incorrectly, producing: model not allowed: anthropic/gpt-oss:120b-cloud.
The dropdown shows the full model ID with provider prefix correctly, but on submission the UI strips ollama/ and prepends anthropic/ as the default provider.
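The behavior described above is consistent with a normalizer that discards any provider prefix and then re-applies a hardcoded default. The sketch below is purely illustrative, assuming hypothetical function names and a default of "anthropic" (none of this is actual OpenClaw code):

```typescript
// Hypothetical sketch of the suspected bug; not actual OpenClaw code.
const DEFAULT_PROVIDER = "anthropic"; // assumed default provider

// Buggy variant: drops everything before the first "/" and always
// re-prefixes with the default provider, losing the selected one.
function buggyNormalize(modelId: string): string {
  const bare = modelId.includes("/")
    ? modelId.split("/").slice(1).join("/")
    : modelId;
  return `${DEFAULT_PROVIDER}/${bare}`;
}

// Fixed variant: keep an explicit provider prefix as-is; only fall
// back to the default when no prefix is present.
function fixedNormalize(modelId: string): string {
  return modelId.includes("/") ? modelId : `${DEFAULT_PROVIDER}/${modelId}`;
}

console.log(buggyNormalize("ollama/gpt-oss:120b-cloud")); // anthropic/gpt-oss:120b-cloud
console.log(fixedNormalize("ollama/gpt-oss:120b-cloud")); // ollama/gpt-oss:120b-cloud
```

The buggy variant reproduces the exact rejected ID from the error message, which is why this looks like a submission-time normalization issue rather than a dropdown rendering issue.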
Steps to reproduce
- Open Control UI → model picker
- Select any Ollama model (e.g. ollama/gpt-oss:120b-cloud)
- Confirm → error: Failed to set model: model not allowed: anthropic/gpt-oss:120b-cloud
Expected behavior
Model switches successfully to ollama/gpt-oss:120b-cloud
Actual behavior
error: Failed to set model: model not allowed: anthropic/gpt-oss:120b-cloud
OpenClaw version
2026.3.13
Operating system
PopOS 24.04
Install method
npm global
Model
gpt-oss:120b-cloud
Provider / routing chain
openclaw -> ollama
Additional provider/model setup details
No response
Logs, screenshots, and evidence
Impact and severity
Severity: low
Workaround: Use /model in chat or openclaw models set via CLI.
Additional information
No response