Bug type
Behavior bug (incorrect output/state without crash)
Beta release blocker
No
Summary
openai/gpt-image-2 over OpenAI Codex OAuth fails with HTTP 403 when models.providers.openai-codex.baseUrl is the legacy https://chatgpt.com/backend-api (no /codex segment), because the image path builds the request URL verbatim while the chat path silently canonicalizes it.
Steps to reproduce
- Run OpenClaw 2026.4.23 (a979721) with a working openai-codex OAuth profile and no OPENAI_API_KEY.
- In ~/.openclaw/openclaw.json, set models.providers.openai-codex.baseUrl to "https://chatgpt.com/backend-api" (legacy form, no /codex).
- Run openclaw capability image generate --model openai/gpt-image-2 --prompt "test".
Expected behavior
Image is generated by gpt-image-2 over Codex OAuth, matching the chat path, which already accepts the legacy baseUrl form. The chat path canonicalizes the URL in extensions/openai/openai-codex-provider.ts:147 (normalizeCodexTransportFields) and successfully POSTs to https://chatgpt.com/backend-api/codex/responses.
Actual behavior
CLI succeeds (exit 0) but logs show:
[image-generation/openai] image auth selected: provider=openai-codex mode=oauth transport=codex-responses requestedModel=gpt-image-2 responsesModel=gpt-5.4 timeoutMs=180000
[image-generation] candidate failed: openai/gpt-image-2: OpenAI Codex image generation failed (HTTP 403): <html> <head> ... </head>
model: fal-ai/flux/dev
The image path constructs ${baseUrl}/responses at extensions/openai/image-generation-provider.ts:567, which yields https://chatgpt.com/backend-api/responses. OpenAI retired the /backend-api/responses alias server-side on 2026-04 (acknowledged in extensions/openai/base-url.test.ts:19), so the request 403s. With auto-fallback active, the agent silently shifts to fal-ai/flux/dev and the user receives an image that is not from gpt-image-2.
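For illustration, the divergence between the two paths can be sketched as follows. This is a minimal sketch, not the actual OpenClaw code: the helper bodies here (the exact matching logic inside isOpenAICodexBaseUrl, the trailing-slash handling) are assumptions; only the function names, the constant, and the file locations come from the report above.

```typescript
// Canonical Codex backend root (value assumed from the URLs observed above).
const OPENAI_CODEX_BASE_URL = "https://chatgpt.com/backend-api/codex";

// Sketch of the image path (image-generation-provider.ts:567): verbatim join.
function buildImageRequestUrl(baseUrl: string): string {
  return `${baseUrl.replace(/\/+$/, "")}/responses`;
}

// Sketch of the recognizer in base-url.ts:11-18: accepts both legacy and
// canonical forms (matching logic assumed).
function isOpenAICodexBaseUrl(baseUrl: string): boolean {
  const trimmed = baseUrl.replace(/\/+$/, "");
  return (
    trimmed === "https://chatgpt.com/backend-api" ||
    trimmed === OPENAI_CODEX_BASE_URL
  );
}

// Sketch of the chat path (openai-codex-provider.ts:147): hard-override any
// Codex-shaped baseUrl to the canonical constant before joining.
function buildChatRequestUrl(baseUrl: string): string {
  const effective = isOpenAICodexBaseUrl(baseUrl)
    ? OPENAI_CODEX_BASE_URL
    : baseUrl;
  return `${effective}/responses`;
}

console.log(buildImageRequestUrl("https://chatgpt.com/backend-api"));
// -> https://chatgpt.com/backend-api/responses (retired alias; 403s)
console.log(buildChatRequestUrl("https://chatgpt.com/backend-api"));
// -> https://chatgpt.com/backend-api/codex/responses (canonical; succeeds)
```

The same legacy config value thus produces a working URL on the chat path and a retired one on the image path.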
OpenClaw version
2026.4.23 (a979721)
Operating system
Ubuntu 24.04 (Linux 6.8.0-110-generic)
Install method
npm global (npm install -g openclaw)
Model
openai/gpt-image-2
Provider / routing chain
openclaw -> openai-codex (OAuth) -> chatgpt.com Codex Responses backend
Additional provider/model setup details
Sammy auth profile, OpenAI Codex OAuth, ChatGPT Pro plan. Relevant config (redacted):
{
  "models": {
    "providers": {
      "openai-codex": {
        "baseUrl": "https://chatgpt.com/backend-api",
        "api": "openai-codex-responses"
      }
    }
  },
  "agents": {
    "defaults": {
      "imageGenerationModel": { "primary": "openai/gpt-image-2" }
    }
  }
}
models.providers.openai is not set. OPENAI_API_KEY is not set. Codex OAuth is freshly refreshed and confirmed working for chat (gpt-5.4, gpt-5.5).
Logs, screenshots, and evidence
Side-by-side reproduction:
| baseUrl in config | Dist patched? | Result |
| --- | --- | --- |
| https://chatgpt.com/backend-api | no | HTTP 403, silent fallback to fal-ai/flux/dev |
| https://chatgpt.com/backend-api/codex | no | HTTP 200, model: gpt-image-2, valid PNG |
| https://chatgpt.com/backend-api | yes (local canonicalization) | HTTP 200, model: gpt-image-2, valid PNG |
| https://chatgpt.com/backend-api/codex | yes | HTTP 200, model: gpt-image-2, valid PNG |
Code references:
- Image path (no normalization): extensions/openai/image-generation-provider.ts:534-548, request URL built at line 567 as ${baseUrl}/responses.
- Chat path (with normalization): extensions/openai/openai-codex-provider.ts:129-149, line 147 hard-overrides any isOpenAICodexBaseUrl match to OPENAI_CODEX_BASE_URL.
- Existing helper: extensions/openai/base-url.ts:11-18 accepts both /backend-api and /backend-api/codex as isOpenAICodexBaseUrl(true).
- Server-side context: extensions/openai/base-url.test.ts:18-19 notes "OpenAI removed the /backend-api/responses alias server-side on 2026-04".
Related closed issues that touched adjacent surface but did not fix this normalization gap:
- A report from openclaud11-sys of the same /codex baseUrl gap on a different install, but the resolution treated it as user config.
- A change merged to main in fbf8b216c6 (post-2026.4.23) whose closing comment frames the remaining 403 as account/backend gating, but the evidence above shows the same Codex OAuth profile reaches /backend-api/codex/responses successfully when the URL is canonical.
Impact and severity
- Affected users: any OpenClaw user on 2026.4.23 (and likely earlier 2026.4.x versions that include the Codex OAuth image route at c84a2f5244) who has models.providers.openai-codex.baseUrl set to the legacy https://chatgpt.com/backend-api form. This shape is common because the chat path historically accepted it and isOpenAICodexBaseUrl still recognizes it as valid.
- Severity: blocks the documented openai/gpt-image-2 via Codex OAuth workflow advertised in docs/tools/image-generation.md. Users see images returned, but they come from a different provider, which is worse than a hard failure because it appears to work.
- Frequency: every image_generate call while the legacy baseUrl is configured.
- Consequence: missed product capability (gpt-image-2 quality + Codex billing entitlement), confusing user experience because the silent fallback hides the failure, wasted Codex OAuth refresh cycles while debugging.
Suggested fix
Mirror the chat-path canonicalization in the image path. PR coming in a follow-up: add canonicalizeCodexResponsesBaseUrl to extensions/openai/base-url.ts and call it inside generateOpenAICodexImage before passing baseUrl into resolveProviderHttpRequestConfig. Non-Codex baseUrls (private proxies, Azure-style endpoints) pass through unchanged.
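A sketch of what the proposed helper could look like, under the assumptions above. Only the helper name (canonicalizeCodexResponsesBaseUrl), its target file, and the canonical URL come from this report; the matching and trimming logic is illustrative, and the real implementation would reuse the existing isOpenAICodexBaseUrl rather than inline string comparisons.

```typescript
// Hypothetical sketch for extensions/openai/base-url.ts: collapse any
// Codex-shaped baseUrl to the canonical /backend-api/codex form, and pass
// every other baseUrl (private proxies, Azure-style endpoints) through
// unchanged apart from trailing-slash trimming.
const OPENAI_CODEX_BASE_URL = "https://chatgpt.com/backend-api/codex";

function canonicalizeCodexResponsesBaseUrl(baseUrl: string): string {
  const trimmed = baseUrl.replace(/\/+$/, "");
  const isCodex =
    trimmed === "https://chatgpt.com/backend-api" ||
    trimmed === OPENAI_CODEX_BASE_URL;
  return isCodex ? OPENAI_CODEX_BASE_URL : trimmed;
}
```

Called at the top of generateOpenAICodexImage, this would make ${baseUrl}/responses resolve to /backend-api/codex/responses for both legacy and canonical config values, mirroring the chat path's behavior.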