Bug type
Regression (worked before, now fails)
Summary
GPT-5.4 via openai-codex OAuth 401s and silently falls back to gpt-5.3-codex
Steps to reproduce
- Install OpenClaw 2026.3.2 and configure the `openai-codex` provider.
- Authenticate via ChatGPT / Codex OAuth (`openclaw auth openai-codex`) using a ChatGPT Plus account that can use GPT-5.4 in the Codex CLI.
- In `openclaw.json` / Web UI, set the default model to `openai-codex/gpt-5.4` with fallback `openai-codex/gpt-5.3-codex`.
- Start a new Discord or Telegram session using the main agent.
- Send a few messages and inspect:
  - Gateway logs for calls to GPT-5.4.
  - Transcript `message.model` fields for assistant messages.
  - Web UI Usage page for model usage.
Expected behavior
- Calls to `openai-codex/gpt-5.4` via ChatGPT / Codex OAuth should succeed without requiring `api.responses.write`.
- Assistant `message.model` should be `gpt-5.4` (not `gpt-5.3-codex`) as long as 5.4 is configured as the primary model.
- Web UI Usage should show GPT-5.4 usage increasing during the session.
- Behavior should match the Codex CLI, which can use GPT-5.4 successfully with the same account.
Actual behavior
- `openclaw status` shows `openai-codex/gpt-5.4 (1050k ctx)` as default, fallback `openai-codex/gpt-5.3-codex`.
- When the agent tries to use GPT-5.4, gateway calls fail with `401 Unauthorized` / `Missing scopes: api.responses.write`.
- The runtime then silently falls back to `gpt-5.3-codex` for responses.
- Transcript shows:
  - Attempts with GPT-5.4 have `stopReason: error`.
  - Successful assistant messages all have `message.model = gpt-5.3-codex`.
- Meanwhile, the Codex CLI (same account, same time) can use GPT-5.4 normally.
OpenClaw version
2026.3.2
Operating system
Windows 11 + WSL2 (Ubuntu)
Install method
`npm install -g openclaw` (global install inside WSL2)
Logs, screenshots, and evidence
- Gateway log snippet (redacted):

      POST https://api.openai.com/v1/responses
      401 Unauthorized
      Missing scopes: api.responses.write

- Transcript sample:
  - GPT-5.4 attempts → `stopReason: "error"`, model `gpt-5.4`.
  - Fallback responses → `stopReason: "stop"`, model `gpt-5.3-codex`.
- Web UI Usage screenshot showing both `gpt-5.3-codex` and `gpt-5.4` entries.
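Because the fallback is silent, the quickest way to confirm which model actually answered is to audit the transcript rather than trust the UI. A minimal sketch of such an audit, assuming a JSONL transcript where each line carries a `message` object with the `model` and `stopReason` fields mentioned above (the exact file layout is an assumption, not a documented OpenClaw format):

```python
import json

def summarize_models(transcript_lines):
    """Count assistant responses per model, split into successful
    ("stop") and errored attempts, based on the assumed JSONL layout."""
    ok, errored = {}, {}
    for line in transcript_lines:
        msg = json.loads(line).get("message", {})
        model = msg.get("model")
        if model is None:
            continue
        bucket = ok if msg.get("stopReason") == "stop" else errored
        bucket[model] = bucket.get(model, 0) + 1
    return ok, errored

# Hypothetical sample mirroring the behavior observed in this report:
sample = [
    '{"message": {"model": "gpt-5.4", "stopReason": "error"}}',
    '{"message": {"model": "gpt-5.3-codex", "stopReason": "stop"}}',
]
ok, errored = summarize_models(sample)
print(ok)       # successful responses per model
print(errored)  # failed attempts per model
```

If `ok` contains only `gpt-5.3-codex` while `errored` contains `gpt-5.4`, the session hit exactly the silent-fallback pattern described above.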
Impact and severity
- Affected users: anyone trying to use GPT-5.4 via ChatGPT / Codex OAuth (without an OpenAI API key).
- Severity: medium–high. The UI claims GPT-5.4 is the active model, but actual responses come from `gpt-5.3-codex`, which can mislead users and make debugging harder.
- Frequency: always reproducible in my environment with 2026.3.2 and Codex OAuth.
- Consequence: users pay for ChatGPT Plus / Codex expecting GPT-5.4, but their OpenClaw agents actually run on 5.3-codex unless they notice the 401 + fallback behavior.
Additional information
Root cause (my investigation)
The `openai-codex` provider currently sends GPT-5.4 requests to `https://api.openai.com/v1/responses`, which requires the `api.responses.write` scope.
ChatGPT / Codex OAuth tokens used by OpenClaw do not include `api.responses.write`, so the API returns 401.
In contrast, the Codex CLI uses `https://chatgpt.com/backend-api/codex/responses` with the same ChatGPT Plus token and works fine.
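The whole difference comes down to which base URL the responses path is joined onto. A hypothetical sketch for illustration (the `responses_url` helper and the path suffixes are my assumptions, not OpenClaw internals; the two resulting URLs are the ones observed in the logs and in the Codex CLI):

```python
def responses_url(base_url: str, codex_api: bool) -> str:
    """Join a provider baseUrl with the responses path.

    Assumed suffixes: the standard OpenAI API exposes /responses under
    /v1, while the Codex backend exposes /codex/responses.
    """
    suffix = "/codex/responses" if codex_api else "/responses"
    return base_url.rstrip("/") + suffix

# Default provider config -> scope-gated endpoint (401 for Codex OAuth):
print(responses_url("https://api.openai.com/v1", codex_api=False))
# -> https://api.openai.com/v1/responses

# Overridden config -> the endpoint the Codex CLI talks to:
print(responses_url("https://chatgpt.com/backend-api", codex_api=True))
# -> https://chatgpt.com/backend-api/codex/responses
```

This is why swapping `baseUrl` (next section) is enough to avoid the 401 without touching OAuth at all.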
Workaround that fixes it for me
I was able to make GPT-5.4 work via Codex OAuth by overriding the openai-codex provider in ~/.openclaw/agents/main/agent/models.json:
"openai-codex": {
"baseUrl": "https://chatgpt.com/backend-api",
"api": "openai-codex-responses",
"models": [
{
"id": "gpt-5.4",
"name": "GPT-5.4",
"api": "openai-codex-responses",
"reasoning": true,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 1050000,
"maxTokens": 128000
},
{
"id": "gpt-5.3-codex",
"name": "GPT-5.3 Codex",
"api": "openai-codex-responses",
"reasoning": true,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 272000,
"maxTokens": 128000
},
{
"id": "gpt-5.1-codex-mini",
"name": "GPT-5.1 Codex Mini",
"api": "openai-codex-responses",
"reasoning": true,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 262144,
"maxTokens": 128000
}
]
}
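Before restarting, it is worth sanity-checking that the override parses and that every model routes through the Codex responses API. A small self-contained check (the inline JSON is a trimmed stand-in for the full override above; `models.json` itself holds the complete block):

```python
import json

# Trimmed stand-in for the "openai-codex" override, wrapped in a
# top-level object so it parses on its own.
override = json.loads("""
{
  "openai-codex": {
    "baseUrl": "https://chatgpt.com/backend-api",
    "api": "openai-codex-responses",
    "models": [
      {"id": "gpt-5.4", "api": "openai-codex-responses"},
      {"id": "gpt-5.3-codex", "api": "openai-codex-responses"}
    ]
  }
}
""")

provider = override["openai-codex"]
# Every model must target the Codex responses API, and the baseUrl must
# point at the ChatGPT backend, or the 401 fallback behavior returns.
assert provider["baseUrl"] == "https://chatgpt.com/backend-api"
assert all(m["api"] == "openai-codex-responses" for m in provider["models"])
print("override OK:", [m["id"] for m in provider["models"]])
```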
⚠️ Wrong Fix: Do NOT modify the OAuth Scope
You might instinctively try to add api.responses.write to the OAuth scope in openai-codex.js:
DO NOT DO THIS!
    - const SCOPE = "openid profile email offline_access";
    + const SCOPE = "openid profile email offline_access api.responses.write";
This will break your OAuth flow entirely. OpenAI's auth server does not accept `api.responses.write` as a valid scope (at least for the ChatGPT Plus OAuth client). After this change, re-authentication fails with a `Missing authorization code` error, and you won't even be able to use `gpt-5.3-codex` anymore.
The correct fix is to change the API endpoint (as described in the Workaround above), not the OAuth scope.
After this change:
- I did not re-run OAuth or change scopes.
- New sessions use `gpt-5.4` (no 401), with `stopReason: "stop"`.
- Web UI Usage shows GPT-5.4 usage increasing.
