Bug type
Regression (worked before, now fails)
Beta release blocker
No
Summary
Body:
OpenClaw 2026.4.26 (be8c246)
After updating to .26, isolated cron jobs using `openai-codex/gpt-5.4-mini` started failing.
`openclaw models list --provider openai-codex` shows:
- openai-codex/gpt-5.4
- openai-codex/gpt-5.5
- openai-codex/gpt-5.4-mini
`gpt-5.4-mini` is shown as configured/authenticated, but the cron run fails with:
```json
{"detail":"The 'openai-codex/gpt-5.4-mini' model is not supported when using Codex with a ChatGPT account."}
```
Example cron job:
- id: 2c72759b-e54a-43a2-bad6-73cbf1deaae3
- name: Polymarket Weather Scan
- session: isolated
- model: openai-codex/gpt-5.4-mini
Expected:
Either the model should work, or `models list` / cron validation should not present it as usable for Codex ChatGPT OAuth.
Actual:
Cron resolves the configured model, starts the run, then fails at runtime with upstream unsupported-model error.
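The fix suggested under "Expected" is a pre-flight check at configuration/validation time rather than a runtime failure. A minimal sketch of that idea (hypothetical, not OpenClaw's actual code; the supported-model set and `auth_mode` values are assumptions for illustration):

```python
# Hypothetical pre-flight validation: reject models that the upstream
# provider will refuse for the current auth mode, before the cron run starts.
CHATGPT_OAUTH_SUPPORTED = {  # assumed set, for illustration only
    "openai-codex/gpt-5.4",
    "openai-codex/gpt-5.5",
}

def validate_cron_model(model: str, auth_mode: str) -> None:
    """Raise at config time instead of failing mid-run."""
    if auth_mode == "chatgpt-oauth" and model not in CHATGPT_OAUTH_SUPPORTED:
        raise ValueError(
            f"{model!r} is not supported with Codex ChatGPT OAuth; "
            "supported: " + ", ".join(sorted(CHATGPT_OAUTH_SUPPORTED))
        )
```

With a check like this, `openclaw models list` and cron validation would agree on what is actually usable, instead of presenting `gpt-5.4-mini` as configured and then failing upstream.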
Worked in 2026.4.23, broke in 2026.4.26.
Steps to reproduce
On 2026.4.26, try to use `openai-codex/gpt-5.4-mini` (e.g. in an isolated cron job); the run fails with the unsupported-model error above.
Expected behavior
See "Expected" in the summary above.
Actual behavior
See "Actual" in the summary above.
OpenClaw version
2026.4.26
Operating system
Ubuntu
Install method
No response
Model
openai-codex/gpt-5.4-mini
Provider / routing chain
OpenClaw
Additional provider/model setup details
No response
Logs, screenshots, and evidence
See the error response quoted in the summary above.
Impact and severity
No response
Additional information
No response