fix(openai-codex): use /backend-api/codex/ base URL #69336

steipete merged 1 commit into openclaw:main
OpenAI removed the /backend-api/responses alias on chatgpt.com server-side.
The OpenAI SDK appends /responses to the configured baseUrl, so OpenClaw's
current baseUrl ("https://chatgpt.com/backend-api") now resolves to
/backend-api/responses and hits a Cloudflare HTML 403 block page. The
provider's 403+HTML error classifier then surfaces this as an auth-scope
failure, triggering fruitless OAuth re-login loops for every GPT-5.4
sub-agent call.
- Point OPENAI_CODEX_BASE_URL at https://chatgpt.com/backend-api/codex
(both the catalog constant and the sibling local constant in the provider).
- Extend isOpenAICodexBaseUrl to accept the new /codex segment while keeping
the legacy path recognized so pre-existing user configs and persisted
model metadata still round-trip through the normalizer correctly.
- Add positive-case test coverage for the new base URL; update existing
normalization tests whose expected canonical output now includes /codex.
Verified with live curl using the exact OAuth access token stored by
OpenClaw: the /codex/responses path returns HTTP 200 with streaming SSE,
while the old /responses alias returns HTTP 403 HTML regardless of auth
headers. Scoped tests (base-url, openai-codex-provider, transport-policy,
openai-provider, index) pass; pnpm tsgo and pnpm build are clean.
Greptile Summary

This hotfix updates the OpenAI Codex base URL from `https://chatgpt.com/backend-api` to `https://chatgpt.com/backend-api/codex`. Confidence Score: 5/5. Safe to merge: targeted hotfix restoring a broken endpoint with full backward compatibility. All three code changes are minimal and precise: two constant updates and one regex extension. The regex correctly handles all expected forms (new canonical, legacy, v1 variants, trailing slashes) and rejects non-matching paths. Backward-compat normalization keeps legacy configs working. No files require special attention.
Landed via rebase onto main.

Thanks @mzogithub!
TL;DR
OpenAI removed the `/backend-api/responses` alias on chatgpt.com server-side. The OpenAI SDK appends `/responses` to the configured `baseUrl`, so OpenClaw's current `baseUrl` (`https://chatgpt.com/backend-api`) now resolves to `/backend-api/responses` and hits a Cloudflare HTML 403 block page. OpenClaw's error classifier then surfaces this as an auth-scope failure, triggering fruitless OAuth re-login loops for every `openai-codex/gpt-5.4` call.

This PR points the base URL at `https://chatgpt.com/backend-api/codex` so the SDK builds `.../backend-api/codex/responses`, which still returns `200 OK` with normal SSE streaming.

Symptom
- `openai-codex/gpt-5.4` (and sibling `gpt-5.4-pro` / `gpt-5.4-mini`) calls fail with HTTP 403 + HTML body on POSTs to `https://chatgpt.com/backend-api/responses`.
- `openai-codex/gpt-5.4` hits the same 403 and falls back to raw truncation.
Using the exact OAuth access token OpenClaw persists in
auth-profiles.json:Same
Authorization: Bearer <token>header in both cases. Only the path differs. Cloudflare — not OpenAI auth — is gating the legacy alias. OpenAI apparently removed the/backend-api/responsesalias some time in the last ~24 hours (2026-04-19/20 window); only/backend-api/codex/responsesis live now.The error then surfaces as "re-authenticate" because OpenClaw's provider-level
HTML body + 403classifier treats Cloudflare block pages as an auth-scope failure. The token is fine — the URL is wrong.Fix
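Why a one-segment base-URL change is sufficient: the SDK mechanically joins `baseUrl` and `/responses`. A minimal sketch of that composition, with the join logic assumed for illustration (not the OpenAI SDK's actual code):

```typescript
// Sketch of how an SDK-style client derives the request URL from the
// configured baseUrl. Join logic is assumed; only the resulting paths
// (taken from the PR description) matter here.
function responsesUrl(baseUrl: string): string {
  return baseUrl.replace(/\/+$/, "") + "/responses";
}

// Old base URL -> path now blocked by Cloudflare with an HTML 403:
const blocked = responsesUrl("https://chatgpt.com/backend-api");
// "https://chatgpt.com/backend-api/responses"

// New base URL -> path that still returns 200 with streaming SSE:
const live = responsesUrl("https://chatgpt.com/backend-api/codex");
// "https://chatgpt.com/backend-api/codex/responses"
console.log(blocked, live);
```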
Three source changes:

- `extensions/openai/openai-codex-catalog.ts`: change `OPENAI_CODEX_BASE_URL` from `"https://chatgpt.com/backend-api"` to `"https://chatgpt.com/backend-api/codex"`.
- `extensions/openai/openai-codex-provider.ts`: same change to the sibling local constant.
- `extensions/openai/base-url.ts`: extend `isOpenAICodexBaseUrl` so both the new canonical form and the legacy form are recognized as a Codex base URL. Backward compatibility matters here because user configs and persisted model metadata may still contain the legacy URL; the existing `normalizeCodexTransport` flow then round-trips them to the new canonical form.

Backward compatibility
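The dual-form recognition that makes this backward compatible can be sketched as follows; the regex shape is inferred from the test cases listed in this PR, not copied from `base-url.ts`:

```typescript
// Assumed shape of isOpenAICodexBaseUrl: the optional (\/codex) group
// keeps the legacy form recognized, while optional (\/v1) and an
// optional trailing slash cover the variants the tests enumerate.
const CODEX_BASE_URL_RE =
  /^https:\/\/chatgpt\.com\/backend-api(\/codex)?(\/v1)?\/?$/;

function isOpenAICodexBaseUrl(url: string): boolean {
  return CODEX_BASE_URL_RE.test(url);
}

console.log(isOpenAICodexBaseUrl("https://chatgpt.com/backend-api/codex"));    // true
console.log(isOpenAICodexBaseUrl("https://chatgpt.com/backend-api/codex/v1")); // true
console.log(isOpenAICodexBaseUrl("https://chatgpt.com/backend-api"));          // true (legacy)
console.log(isOpenAICodexBaseUrl("https://chatgpt.com/backend-api/codex/v2")); // false
```

Under this assumed shape, the legacy acceptance is just the `?` on the `(\/codex)` group, which lines up with the follow-up removal plan in the reviewer notes below.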
- Configs with `baseUrl: "https://chatgpt.com/backend-api"` are still recognized by `isOpenAICodexBaseUrl`, so they still route through the Codex transport and get normalized to the new canonical form by `normalizeCodexTransport` / `normalizeResolvedModel` on the next model resolution.
- No edits to `openclaw.json` are required.
- `openai-codex/gpt-5.4` requests start succeeding the moment the gateway picks up the new build.

Tests
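The behavior the updated tests pin down is that any recognized legacy or stale Codex URL normalizes to the single canonical constant. A minimal sketch; the constant name mirrors the description, but the function body is an assumption:

```typescript
const OPENAI_CODEX_BASE_URL = "https://chatgpt.com/backend-api/codex";

// Assumed sketch of the normalizeCodexTransport upgrade step: any
// recognized Codex base URL (legacy, stale /v1, trailing slash) is
// rewritten to the canonical constant; other URLs pass through.
function normalizeCodexBaseUrl(baseUrl: string): string {
  const isCodex =
    /^https:\/\/chatgpt\.com\/backend-api(\/codex)?(\/v1)?\/?$/.test(baseUrl);
  return isCodex ? OPENAI_CODEX_BASE_URL : baseUrl;
}

console.log(normalizeCodexBaseUrl("https://chatgpt.com/backend-api"));
// -> "https://chatgpt.com/backend-api/codex" (legacy config upgraded)
console.log(normalizeCodexBaseUrl("https://chatgpt.com/backend-api/codex/v1"));
// -> "https://chatgpt.com/backend-api/codex" (stale /v1 upgraded)
console.log(normalizeCodexBaseUrl("https://example.com/api"));
// -> "https://example.com/api" (non-Codex URL left alone)
```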
Added positive-case coverage for the new base URL in `extensions/openai/base-url.test.ts`:

- `https://chatgpt.com/backend-api/codex` → `true`
- `https://chatgpt.com/backend-api/codex/` → `true`
- `https://chatgpt.com/backend-api/codex/v1` → `true`
- `https://chatgpt.com/backend-api/codex/v1/` → `true`
- `https://chatgpt.com/backend-api` → `true` (backward compat).
- `https://chatgpt.com/backend-api/codex/v2` → `false` (new negative case).

Updated existing assertions whose expected canonical URL now includes `/codex`:

- `extensions/openai/openai-codex-provider.test.ts` (three cases): `normalizeResolvedModel` default-api filling, `normalizeResolvedModel` stale `/v1` → canonical, and `normalizeTransport` stale `/v1` → canonical. The stale-`/v1` migration tests continue to cover the same behavior; they now assert migration to the new canonical form.
- `extensions/openai/openai-provider.test.ts` (one case): `resolveDynamicModel` fallback (empty registry → uses the constant directly).

Tests that pass the legacy URL as input but assert on non-URL fields (tool schema normalization, transport-policy headers, etc.) were left alone; they exercise the backward-compat path and continue to pass.
Verification steps
Local results against `origin/main` at `442deb0816`:

- `pnpm tsgo`: clean.
- `pnpm build`: clean.
- `base-url.test.ts`: 4/4 passing.
- `openai-codex-provider.test.ts`: 16/16 passing.
- `transport-policy.test.ts`: 5/5 passing.
- `openai-provider.test.ts` (touched): 25/25 passing.
- `index.test.ts` (Codex tool-schema normalization, smoke): passing.

Notes for reviewers
- `openai-codex/gpt-5.4` is currently broken by the upstream Cloudflare block. Each deployment owner is presumably burning through fruitless re-auth attempts until the `baseUrl` constant lands.
- The `isOpenAICodexBaseUrl` regex intentionally keeps the legacy shape recognized. Once this has been out long enough that no persisted configs still carry `https://chatgpt.com/backend-api`, the legacy branch of the regex can be removed in a follow-up; the runtime normalizer already upgrades on the first round-trip.