Bug type
Behavior bug (incorrect output/state without crash)
Beta release blocker
No
Summary
Version
OpenClaw 2026.4.2
Summary
A session does not reliably return to or stay on the configured primary model after fallback occurs. Even with openai-codex/gpt-5.4 configured as primary, the active session continues using a fallback model.
Minimal config
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "openai-codex/gpt-5.4",
        "fallbacks": [
          "amazon-bedrock/minimax.minimax-m2.5",
          "openrouter/openrouter/free"
        ]
      }
    },
    "list": [
      { "id": "main", "model": "openai-codex/gpt-5.4" }
    ]
  }
}
Observed examples
- The session got stuck on the fallback model amazon-bedrock/minimax.minimax-m2.5.
- After removing MiniMax from the fallbacks, the session used openrouter/openrouter/free instead of the primary.
Impact
- Users cannot trust the configured primary model to remain active.
- Model selection in the UI/session appears inconsistent after failover.
Steps to reproduce
- Configure openai-codex/gpt-5.4 as the primary model.
- Configure one or more fallback models.
- Start a normal session in the Control UI.
- Cause the primary model to fail once so fallback activates.
- After fallback succeeds, check the current model in the UI or via session status.
- Attempt to switch back to openai-codex/gpt-5.4.
Expected behavior
- The session should use openai-codex/gpt-5.4 by default.
- Fallback should only be used when the primary fails.
- After switching back, the current model should remain openai-codex/gpt-5.4.
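The expected semantics above can be sketched as a per-request selection loop that always tries the configured primary first and only falls back on failure, instead of latching onto a fallback permanently. This is an illustrative sketch of the desired behavior, not OpenClaw's actual implementation; the names `selectModel` and `TryModel` are hypothetical.

```typescript
// true = the request to this model succeeded
type TryModel = (model: string) => boolean;

// Expected semantics: every request starts from the primary (no sticky
// fallback), walking the fallback list only when earlier models fail.
function selectModel(
  primary: string,
  fallbacks: string[],
  tryModel: TryModel
): string {
  for (const model of [primary, ...fallbacks]) {
    if (tryModel(model)) return model;
  }
  throw new Error("all configured models failed");
}
```

Under these semantics, a single transient failure of the primary would route one request to a fallback, and the very next request would return to openai-codex/gpt-5.4 on its own.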
Actual behavior
- The session keeps using the fallback model after fallback occurred.
- Explicitly switching back to openai-codex/gpt-5.4 appears inconsistent or temporary.
- Removing one fallback causes another fallback to be used instead of the primary.
OpenClaw version
2026.4.2
Operating system
macOS 26.4
Install method
npm install
Model
openai-codex/gpt-5.4, minimax/m2.5, openrouter/free
Provider / routing chain
openclaw -> openai-codex/gpt-5.4 -> minimax/m2.5
Additional provider/model setup details
No response
Logs, screenshots, and evidence
No response
Impact and severity
No response
Additional information
No response