Summary
On OpenClaw 2026.3.2, `openai-codex/gpt-5.4` can be added to the config and appears in `models list`, but it is still treated as missing and fails at runtime with "Unknown model" / HTTP 404.
This makes it look like GPT-5.4 via `openai-codex` is supported when it is not yet wired through the runtime.
Environment
- OpenClaw version: 2026.3.2
- Install method: pnpm
- Auth: `openai-codex` OAuth profile present and valid
- Platform: macOS
Reproduction
- Configure the main agent or default model as `openai-codex/gpt-5.4` (or `openai-codex/gpt-5.4-codex`).
- Restart the gateway/node.
- Run:
  - `openclaw models list`
  - `openclaw models status --json`
- Start a fresh main session / normal agent turn.
Actual behavior
- `openclaw models list` shows:
  - `openai-codex/gpt-5.4 ... configured,missing`
  - `openai-codex/gpt-5.4-codex ... configured,missing`
- Runtime errors include:
  - `FailoverError: Unknown model: openai-codex/gpt-5.4`
  - `FailoverError: HTTP 404: 404 page not found`
Expected behavior
One of these should be true:
- `openai-codex/gpt-5.4` is fully supported in runtime/catalog/forward-compat and works, or
- OpenClaw rejects it early and clearly, instead of accepting it into config and then failing later at runtime.
Findings
This looks deeper than an allow-list issue. The installed runtime still appears hard-wired around `gpt-5.3-codex` for `openai-codex`:
- `dist/model-picker-CGU6hX_z.js`: `OPENAI_CODEX_DEFAULT_MODEL = "openai-codex/gpt-5.3-codex"`
- `dist/model-ZurrFOi9.js`: Codex forward-compat/fallback logic is centered on `gpt-5.3-codex`
- `dist/model-catalog-qZGHxvcI.js`: no equivalent generic handling for `gpt-5.4`

So the config layer accepts the model, but runtime/catalog resolution does not fully support it.
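The accept-then-fail gap can be sketched as follows. This is a hypothetical reconstruction, not OpenClaw's actual source: everything except the quoted `OPENAI_CODEX_DEFAULT_MODEL` constant is an illustrative name.

```typescript
// Quoted from dist/model-picker-CGU6hX_z.js; everything else here is illustrative.
const OPENAI_CODEX_DEFAULT_MODEL = "openai-codex/gpt-5.3-codex";

// Config layer (hypothetical): accepts any syntactically valid "provider/model" id.
function configAccepts(id: string): boolean {
  return /^[\w-]+\/[\w.-]+$/.test(id);
}

// Runtime catalog (hypothetical): only ids actually wired through resolution.
const RUNTIME_CATALOG = new Set<string>([OPENAI_CODEX_DEFAULT_MODEL]);

function runtimeResolves(id: string): boolean {
  return RUNTIME_CATALOG.has(id);
}

// gpt-5.4 passes config validation but fails runtime resolution,
// which surfaces later as "Unknown model" / HTTP 404.
console.log(configAccepts("openai-codex/gpt-5.4"));   // true
console.log(runtimeResolves("openai-codex/gpt-5.4")); // false
```

Under this reading, both `gpt-5.4` and `gpt-5.4-codex` would be accepted by config yet unresolvable at runtime, matching the observed behavior.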
Suggested fix
- Add proper runtime/catalog/forward-compat support for `openai-codex/gpt-5.4` (and possibly `gpt-5.4-codex` if that is the intended canonical id), or
- fail validation early with a clear error if `openai-codex/gpt-5.4` is not supported yet.
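The second option (fail early) could look something like this minimal sketch; the function and set names are assumptions for illustration, not OpenClaw's real config loader:

```typescript
// Hypothetical early validation at config-load time.
// SUPPORTED_CODEX_MODELS would be derived from the runtime catalog,
// so config and runtime can never disagree.
const SUPPORTED_CODEX_MODELS = new Set<string>(["openai-codex/gpt-5.3-codex"]);

function validateModelId(id: string): void {
  if (id.startsWith("openai-codex/") && !SUPPORTED_CODEX_MODELS.has(id)) {
    // Reject at config time with an actionable message, instead of
    // deferring to a runtime FailoverError / HTTP 404.
    throw new Error(
      `Unsupported model "${id}". Supported openai-codex models: ` +
        Array.from(SUPPORTED_CODEX_MODELS).join(", ")
    );
  }
}
```

Sourcing the allow-list from the same catalog the runtime resolves against is the key design point; a separate hand-maintained list would just recreate this bug.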
Notes
I verified that falling back to `openai-codex/gpt-5.3-codex` restores normal operation, which makes this look specifically like missing GPT-5.4 integration rather than broken OAuth/auth.