Summary
Regression: this worked in 2026.4.23, but in 2026.4.26 an OpenClaw session running on the Codex ChatGPT backend hard-fails when it switches to openai-codex/gpt-5.4-mini.
Regression
- Worked fine in: 2026.4.23
- Broken in: 2026.4.26
Config note
There were no intentional changes to the models config related to this failure.
Current config still has:
- Primary model: openai-codex/gpt-5.4
- openai-codex/gpt-5.4-mini present in the configured OpenAI Codex model list
openclaw.json was written recently, but only for unrelated reasons:
- 2026-04-28: openclaw mcp set trendtrack ...
- 2026-04-29: openclaw doctor --non-interactive --fix
So this does not look like it was caused by a manual models-config change.
What happened
- Session was running normally on openai-codex/gpt-5.4
- Runtime switched the session to openai-codex/gpt-5.4-mini
- Multiple assistant turns then failed with: "The 'openai-codex/gpt-5.4-mini' model is not supported when using Codex with a ChatGPT account."
- Runtime later switched the session back to openai-codex/gpt-5.4
Expected
One of these should happen instead:
- OpenClaw should know gpt-5.4-mini is unsupported for ChatGPT-backed Codex accounts and block the switch before it is applied
- It should fall back automatically to a supported model like openai-codex/gpt-5.4
- At minimum, the model-change path should fail safe without causing repeated assistant-turn failures
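The expected behavior above could be sketched as a validate-then-fall-back step in the model-switch path. This is a minimal illustration, not OpenClaw's actual code: the names (AccountType, isModelSupported, applyModelSwitch) and the support table are assumptions based on the observed error.

```typescript
// Hypothetical sketch: validate a model switch before applying it, and fall
// back to a known-good model instead of letting every assistant turn fail.
type AccountType = "chatgpt" | "api-key";

// Assumed support table: gpt-5.4-mini is rejected for ChatGPT-backed accounts,
// matching the runtime error seen in the transcript.
const UNSUPPORTED: Record<AccountType, Set<string>> = {
  "chatgpt": new Set(["openai-codex/gpt-5.4-mini"]),
  "api-key": new Set(),
};

function isModelSupported(account: AccountType, model: string): boolean {
  return !UNSUPPORTED[account].has(model);
}

// Returns the model the session should actually run on after a switch request.
function applyModelSwitch(
  account: AccountType,
  requested: string,
  fallback = "openai-codex/gpt-5.4",
): string {
  if (isModelSupported(account, requested)) return requested;
  // Fail safe: keep the session on a supported model rather than accepting
  // a selection that will error on every subsequent assistant turn.
  return fallback;
}
```

With this shape, the switch to gpt-5.4-mini in the transcript would have resolved to gpt-5.4 instead of being applied verbatim.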
Actual
The session accepted the unsupported model selection, then assistant turns failed repeatedly at runtime.
Environment
- Worked in OpenClaw: 2026.4.23
- Broken in OpenClaw: 2026.4.26
- Backend/provider: openai-codex
- API adapter: openai-codex-responses
- Account type: ChatGPT-backed Codex account
- Surface: Discord
Evidence
Exported log slice:
/home/brandon/.openclaw/workspace/exports/openclaw-gpt-5.4-mini-chatgpt-account-issue-2026-04-29.md
Relevant transcript entries:
{"type":"custom","customType":"model-snapshot","data":{"timestamp":1777474052260,"provider":"openai-codex","modelApi":"openai-codex-responses","modelId":"gpt-5.4-mini"},"timestamp":"2026-04-29T14:47:32.260Z"}
{"type":"message","timestamp":"2026-04-29T14:47:40.885Z","message":{"role":"assistant","api":"openai-codex-responses","provider":"openai-codex","model":"openai-codex/gpt-5.4-mini","stopReason":"error","errorMessage":"The model is not supported when using Codex with a ChatGPT account."}}
{"type":"message","timestamp":"2026-04-29T14:54:25.728Z","message":{"role":"assistant","api":"openai-codex-responses","provider":"openai-codex","model":"openai-codex/gpt-5.4-mini","stopReason":"error","errorMessage":"The model is not supported when using Codex with a ChatGPT account."}}
{"type":"message","timestamp":"2026-04-29T14:59:43.482Z","message":{"role":"assistant","api":"openai-codex-responses","provider":"openai-codex","model":"openai-codex/gpt-5.4-mini","stopReason":"error","errorMessage":"The model is not supported when using Codex with a ChatGPT account."}}
Repro idea
- Use OpenClaw with a ChatGPT-backed Codex account
- Switch the active model for a session to openai-codex/gpt-5.4-mini
- Send a normal user message
- Observe the assistant turn fail instead of falling back or rejecting the switch earlier
Suggested fix
Add capability gating for account-backed Codex models so unsupported models like openai-codex/gpt-5.4-mini cannot be selected for incompatible account types.
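One possible shape for that gating is to filter the configured model list by account type at selection time, so unsupported models never appear as selectable options. The ModelEntry shape and accounts field below are illustrative assumptions, not OpenClaw's real config schema:

```typescript
// Hypothetical sketch of capability gating at selection time.
type AccountType = "chatgpt" | "api-key";

interface ModelEntry {
  id: string;
  // Account types this model is known to work with.
  accounts: AccountType[];
}

// Assumed configured list, mirroring the report: gpt-5.4-mini is only
// usable with API-key-backed Codex accounts.
const CONFIGURED_MODELS: ModelEntry[] = [
  { id: "openai-codex/gpt-5.4", accounts: ["chatgpt", "api-key"] },
  { id: "openai-codex/gpt-5.4-mini", accounts: ["api-key"] },
];

// Only models compatible with the active account type are offered,
// so an unsupported switch cannot be requested in the first place.
function selectableModels(account: AccountType): string[] {
  return CONFIGURED_MODELS
    .filter((m) => m.accounts.includes(account))
    .map((m) => m.id);
}
```

Gating at selection time complements a runtime fallback: the first prevents the bad switch, the second keeps the session usable if an unsupported model slips through anyway.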