
[Bug]: openai/gpt-image-2 via Codex OAuth fails with HTTP 403 when openai-codex.baseUrl is the legacy /backend-api form (no /codex) #71460

@GodsBoy

Description


Bug type

Behavior bug (incorrect output/state without crash)

Beta release blocker

No

Summary

openai/gpt-image-2 over OpenAI Codex OAuth fails with HTTP 403 when models.providers.openai-codex.baseUrl is the legacy https://chatgpt.com/backend-api (no /codex segment), because the image path builds the request URL verbatim while the chat path silently canonicalizes it.

Steps to reproduce

  1. Run OpenClaw 2026.4.23 (a979721) with a working openai-codex OAuth profile and no OPENAI_API_KEY.
  2. In ~/.openclaw/openclaw.json, set models.providers.openai-codex.baseUrl to "https://chatgpt.com/backend-api" (legacy form, no /codex).
  3. Run openclaw capability image generate --model openai/gpt-image-2 --prompt "test".

Expected behavior

Image is generated by gpt-image-2 over Codex OAuth, identical to the chat path which already accepts the legacy baseUrl form. The chat path canonicalizes the URL in extensions/openai/openai-codex-provider.ts:147 (normalizeCodexTransportFields) and successfully POSTs to https://chatgpt.com/backend-api/codex/responses.

Actual behavior

CLI succeeds (exit 0) but logs show:

[image-generation/openai] image auth selected: provider=openai-codex mode=oauth transport=codex-responses requestedModel=gpt-image-2 responsesModel=gpt-5.4 timeoutMs=180000
[image-generation] candidate failed: openai/gpt-image-2: OpenAI Codex image generation failed (HTTP 403): <html> <head> ... </head>
model: fal-ai/flux/dev

The image path constructs ${baseUrl}/responses at extensions/openai/image-generation-provider.ts:567, which yields https://chatgpt.com/backend-api/responses. OpenAI retired the /backend-api/responses alias server-side on 2026-04 (acknowledged in extensions/openai/base-url.test.ts:19), so the request 403s. With auto-fallback active, the agent silently shifts to fal-ai/flux/dev and the user receives an image that is not from gpt-image-2.
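The divergence between the two paths can be sketched as follows. This is a minimal reconstruction for illustration, not the actual source: the function names `imagePathRequestUrl` and `chatPathRequestUrl` and the recognition regex are assumptions; only the URL shapes and the `${baseUrl}/responses` construction come from this report.

```typescript
// Canonical Codex base URL, per the chat path's normalization target.
const OPENAI_CODEX_BASE_URL = "https://chatgpt.com/backend-api/codex";

// Image path (buggy, hypothetical sketch): appends /responses to the
// configured baseUrl verbatim, so the legacy form yields a retired endpoint.
function imagePathRequestUrl(baseUrl: string): string {
  return `${baseUrl}/responses`;
}

// Chat path (hypothetical sketch): hard-overrides any recognized Codex
// base URL to the canonical form before appending /responses.
function chatPathRequestUrl(baseUrl: string): string {
  const isCodex = /^https:\/\/chatgpt\.com\/backend-api(\/codex)?\/?$/.test(baseUrl);
  const canonical = isCodex ? OPENAI_CODEX_BASE_URL : baseUrl;
  return `${canonical}/responses`;
}

const legacy = "https://chatgpt.com/backend-api";
imagePathRequestUrl(legacy); // https://chatgpt.com/backend-api/responses       -> 403 (alias retired)
chatPathRequestUrl(legacy);  // https://chatgpt.com/backend-api/codex/responses -> 200
```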

OpenClaw version

2026.4.23 (a979721)

Operating system

Ubuntu 24.04 (Linux 6.8.0-110-generic)

Install method

npm global (npm install -g openclaw)

Model

openai/gpt-image-2

Provider / routing chain

openclaw -> openai-codex (OAuth) -> chatgpt.com Codex Responses backend

Additional provider/model setup details

Sammy auth profile, OpenAI Codex OAuth, ChatGPT Pro plan. Relevant config (redacted):

{
  "models": {
    "providers": {
      "openai-codex": {
        "baseUrl": "https://chatgpt.com/backend-api",
        "api": "openai-codex-responses"
      }
    }
  },
  "agents": {
    "defaults": {
      "imageGenerationModel": { "primary": "openai/gpt-image-2" }
    }
  }
}

models.providers.openai is not set. OPENAI_API_KEY is not set. Codex OAuth is freshly refreshed and confirmed working for chat (gpt-5.4, gpt-5.5).

Logs, screenshots, and evidence

Side-by-side reproduction:

| baseUrl in config | Dist patched? | Result |
| --- | --- | --- |
| https://chatgpt.com/backend-api | no | HTTP 403, silent fallback to fal-ai/flux/dev |
| https://chatgpt.com/backend-api/codex | no | HTTP 200, model: gpt-image-2, valid PNG |
| https://chatgpt.com/backend-api | yes (local canonicalization) | HTTP 200, model: gpt-image-2, valid PNG |
| https://chatgpt.com/backend-api/codex | yes | HTTP 200, model: gpt-image-2, valid PNG |

Code references:

  • Image path (no normalization): extensions/openai/image-generation-provider.ts:534-548, request URL built at line 567 as ${baseUrl}/responses.
  • Chat path (with normalization): extensions/openai/openai-codex-provider.ts:129-149, line 147 hard-overrides any isOpenAICodexBaseUrl match to OPENAI_CODEX_BASE_URL.
  • Existing helper: extensions/openai/base-url.ts:11-18 accepts both /backend-api and /backend-api/codex as isOpenAICodexBaseUrl(true).
  • Server-side context: extensions/openai/base-url.test.ts:18-19 notes "OpenAI removed the /backend-api/responses alias server-side on 2026-04".
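For reference, the existing predicate accepts both base URL forms; a sketch of its behavior as described above (the body here is assumed from this report, not copied from extensions/openai/base-url.ts):

```typescript
// Hypothetical sketch of isOpenAICodexBaseUrl: returns true for both the
// legacy and canonical Codex base URL forms, after stripping trailing slashes.
function isOpenAICodexBaseUrl(baseUrl: string): boolean {
  const trimmed = baseUrl.replace(/\/+$/, "");
  return trimmed === "https://chatgpt.com/backend-api"
    || trimmed === "https://chatgpt.com/backend-api/codex";
}

isOpenAICodexBaseUrl("https://chatgpt.com/backend-api");       // true (legacy form)
isOpenAICodexBaseUrl("https://chatgpt.com/backend-api/codex"); // true (canonical form)
isOpenAICodexBaseUrl("https://my-proxy.example.com/v1");       // false (passes through)
```

So the legacy form is treated as valid by the shared helper, yet only the chat path acts on that by canonicalizing; the image path never consults it.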

Several related closed issues touched adjacent surfaces but did not fix this normalization gap.

Impact and severity

  • Affected users: any OpenClaw user on 2026.4.23 (and likely earlier 2026.4.x versions that include the Codex OAuth image route at c84a2f5244) who has models.providers.openai-codex.baseUrl set to the legacy https://chatgpt.com/backend-api form. This shape is common because the chat path historically accepted it and isOpenAICodexBaseUrl still recognizes it as valid.
  • Severity: blocks the documented openai/gpt-image-2 via Codex OAuth workflow advertised in docs/tools/image-generation.md. Users see images returned but they come from a different provider, which is worse than a hard failure because it appears to work.
  • Frequency: every image_generate call while the legacy baseUrl is configured.
  • Consequence: missed product capability (gpt-image-2 quality + Codex billing entitlement), confusing user experience because the silent fallback hides the failure, wasted Codex OAuth refresh cycles while debugging.

Suggested fix

Mirror the chat-path canonicalization in the image path. A follow-up PR will add canonicalizeCodexResponsesBaseUrl to extensions/openai/base-url.ts and call it inside generateOpenAICodexImage before passing baseUrl into resolveProviderHttpRequestConfig. Non-Codex baseUrls (private proxies, Azure-style endpoints) pass through unchanged.
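A possible shape for the proposed helper, as a sketch: canonicalizeCodexResponsesBaseUrl and OPENAI_CODEX_BASE_URL are the names suggested in this report, but the body below is an assumption about how it could be written, not the actual patch.

```typescript
const OPENAI_CODEX_BASE_URL = "https://chatgpt.com/backend-api/codex";

// Hypothetical sketch: map any recognized Codex base URL (legacy or canonical,
// with or without trailing slashes) to the canonical form; leave everything
// else untouched so private proxies and Azure-style endpoints keep working.
function canonicalizeCodexResponsesBaseUrl(baseUrl: string): string {
  const trimmed = baseUrl.replace(/\/+$/, "");
  const isCodex = trimmed === "https://chatgpt.com/backend-api"
    || trimmed === OPENAI_CODEX_BASE_URL;
  return isCodex ? OPENAI_CODEX_BASE_URL : trimmed;
}

canonicalizeCodexResponsesBaseUrl("https://chatgpt.com/backend-api");
// -> "https://chatgpt.com/backend-api/codex"
canonicalizeCodexResponsesBaseUrl("https://my-proxy.example.com/v1");
// -> "https://my-proxy.example.com/v1" (unchanged)
```

With this in place, the image path's `${baseUrl}/responses` construction would produce the same canonical URL the chat path already reaches.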
