Regression: Codex ChatGPT accounts hard-fail when session switches to openai-codex/gpt-5.4-mini #74451

@0xCyda

Description

@0xCyda

Summary

Regression: this worked in 2026.4.23, but on 2026.4.26 an OpenClaw session running on the Codex ChatGPT backend hard-fails when it switches to openai-codex/gpt-5.4-mini.

Regression

  • Worked fine in: 2026.4.23
  • Broken in: 2026.4.26

Config note

There were no intentional changes to the models config related to this failure.

Current config still has:

  • primary model: openai-codex/gpt-5.4
  • openai-codex/gpt-5.4-mini present in the configured OpenAI Codex model list

openclaw.json was written recently, but only for unrelated reasons:

  • 2026-04-28: openclaw mcp set trendtrack ...
  • 2026-04-29: openclaw doctor --non-interactive --fix

So this does not look like it was caused by a manual models-config change.

What happened

  • Session was running normally on openai-codex/gpt-5.4
  • Runtime switched the session to openai-codex/gpt-5.4-mini
  • Multiple assistant turns then failed with:
    • The 'openai-codex/gpt-5.4-mini' model is not supported when using Codex with a ChatGPT account.
  • Runtime later switched the session back to openai-codex/gpt-5.4

Expected

One of these should happen instead:

  1. OpenClaw should know gpt-5.4-mini is unsupported for ChatGPT-backed Codex accounts and block the switch before it is applied
  2. It should fall back automatically to a supported model like openai-codex/gpt-5.4
  3. At minimum, the model-change path should fail safe without causing repeated assistant-turn failures
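To make the expected fail-safe behavior concrete, here is a minimal sketch of option 2 (automatic fallback at switch time). All names (`CodexAccountType`, `isModelSupported`, `resolveModelSwitch`) are hypothetical and illustrative, not OpenClaw's actual API; the unsupported-model entry just mirrors the error observed in this issue.

```typescript
// Hypothetical sketch: gate a model switch up front and fall back to a
// known-good model instead of letting assistant turns fail at runtime.
// Names and the hardcoded list are illustrative, not OpenClaw internals.

type CodexAccountType = "chatgpt" | "api-key";

// Reflects the error seen in this issue; a real implementation should
// source this from provider capability metadata rather than a constant.
const UNSUPPORTED_FOR_CHATGPT = new Set<string>(["openai-codex/gpt-5.4-mini"]);

function isModelSupported(model: string, account: CodexAccountType): boolean {
  return !(account === "chatgpt" && UNSUPPORTED_FOR_CHATGPT.has(model));
}

// Returns the model the session should actually use, and whether the
// requested switch was replaced by the fallback.
function resolveModelSwitch(
  requested: string,
  account: CodexAccountType,
  fallback: string,
): { model: string; fellBack: boolean } {
  if (isModelSupported(requested, account)) {
    return { model: requested, fellBack: false };
  }
  return { model: fallback, fellBack: true };
}
```

With this shape, the runtime's switch to gpt-5.4-mini on a ChatGPT-backed account would resolve to gpt-5.4 (with a visible `fellBack` flag for logging) rather than being applied and failing every subsequent turn.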

Actual

The session accepted the unsupported model selection, then assistant turns failed repeatedly at runtime.

Environment

  • Worked in OpenClaw: 2026.4.23
  • Broken in OpenClaw: 2026.4.26
  • Backend/provider: openai-codex
  • API adapter: openai-codex-responses
  • Account type: ChatGPT-backed Codex account
  • Surface: Discord

Evidence

Exported log slice:

  • /home/brandon/.openclaw/workspace/exports/openclaw-gpt-5.4-mini-chatgpt-account-issue-2026-04-29.md

Relevant transcript entries:

{"type":"custom","customType":"model-snapshot","data":{"timestamp":1777474052260,"provider":"openai-codex","modelApi":"openai-codex-responses","modelId":"gpt-5.4-mini"},"timestamp":"2026-04-29T14:47:32.260Z"}
{"type":"message","timestamp":"2026-04-29T14:47:40.885Z","message":{"role":"assistant","api":"openai-codex-responses","provider":"openai-codex","model":"openai-codex/gpt-5.4-mini","stopReason":"error","errorMessage":"The model is not supported when using Codex with a ChatGPT account."}}
{"type":"message","timestamp":"2026-04-29T14:54:25.728Z","message":{"role":"assistant","api":"openai-codex-responses","provider":"openai-codex","model":"openai-codex/gpt-5.4-mini","stopReason":"error","errorMessage":"The model is not supported when using Codex with a ChatGPT account."}}
{"type":"message","timestamp":"2026-04-29T14:59:43.482Z","message":{"role":"assistant","api":"openai-codex-responses","provider":"openai-codex","model":"openai-codex/gpt-5.4-mini","stopReason":"error","errorMessage":"The model is not supported when using Codex with a ChatGPT account."}}

Repro idea

  1. Use OpenClaw with a ChatGPT-backed Codex account
  2. Switch the active model for a session to openai-codex/gpt-5.4-mini
  3. Send a normal user message
  4. Observe the assistant turn fail instead of falling back or rejecting the switch earlier

Suggested fix

Add capability gating for account-backed Codex models so unsupported models like openai-codex/gpt-5.4-mini cannot be selected for incompatible account types.
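As a sketch of what that gating could look like: attach per-account-type capability data to each model entry and filter the selectable list before a switch is ever offered or applied. All names here (`ModelCapability`, `selectableModels`) are hypothetical, and the capability data is inferred only from the error in this issue, not from OpenClaw's real model registry.

```typescript
// Hypothetical sketch: capability-gate the Codex model list by account
// type so an unsupported model cannot be selected in the first place.

interface ModelCapability {
  id: string;
  supportedAccountTypes: Set<string>;
}

// Illustrative data: the mini model rejects ChatGPT-backed accounts,
// matching the runtime error reported above.
const CODEX_MODELS: ModelCapability[] = [
  { id: "openai-codex/gpt-5.4", supportedAccountTypes: new Set(["chatgpt", "api-key"]) },
  { id: "openai-codex/gpt-5.4-mini", supportedAccountTypes: new Set(["api-key"]) },
];

// Models a session on the given account type is allowed to switch to.
function selectableModels(accountType: string): string[] {
  return CODEX_MODELS
    .filter((m) => m.supportedAccountTypes.has(accountType))
    .map((m) => m.id);
}
```

With gating at this layer, the runtime's auto-switch logic would never see gpt-5.4-mini as a candidate for a ChatGPT-backed session, which addresses expected behavior 1 directly.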
