Summary

`openai-codex` runtime requests can fail in proxy-only environments even when:
- ChatGPT Plus OAuth login succeeds
- `openclaw models status` shows a healthy `openai-codex` profile
- the same machine can reach `https://chatgpt.com/backend-api/codex/responses` successfully via `curl`

On my setup, OpenClaw would repeatedly return `fetch failed` for runtime Codex requests until I forced Node to honor env proxy routing with `NODE_USE_ENV_PROXY=1`.

This looks related to, but distinct from, the OAuth proxy issues in:

and likely related to the runtime symptom in:
Environment

- OpenClaw: 2026.3.8
- Node: 22.22.0
- OS: Linux
- Auth: `openai-codex` via ChatGPT Plus OAuth
- Model: `openai-codex/gpt-5.4`
- Proxy env present:
  - `HTTP_PROXY=http://192.168.31.201:7890`
  - `HTTPS_PROXY=http://192.168.31.201:7890`
  - `NO_PROXY=localhost,127.0.0.1`
Symptoms

Runtime requests fail with session log entries like:

    errorMessage: "fetch failed"
    provider: "openai-codex"
    model: "gpt-5.4"

OAuth itself was not the issue:
- the `gh auth` equivalent for OpenClaw showed the `openai-codex:default` OAuth profile as healthy
- `openclaw models list` included `openai-codex/gpt-5.4`
- `openclaw models status --json` showed the OAuth profile status as `ok`
Important Reproduction Detail

Plain Node `fetch` and `curl` behaved differently on the same host.

1. Without `NODE_USE_ENV_PROXY=1`, Node `fetch` did not actually use the env proxy:

    node -e "const ac=new AbortController(); setTimeout(()=>ac.abort(),10000); fetch('https://chatgpt.com/backend-api/models',{signal:ac.signal,headers:{Authorization:'Bearer invalid',Origin:'https://chatgpt.com',Referer:'https://chatgpt.com/','User-Agent':'Mozilla/5.0'}}).then(async r=>{console.log('status',r.status)}).catch(e=>{console.error(e.name,e.message)})"

Observed:

    AbortError This operation was aborted

2. With `NODE_USE_ENV_PROXY=1`, the same fetch changed behavior immediately:

    NODE_USE_ENV_PROXY=1 node -e "const ac=new AbortController(); setTimeout(()=>ac.abort(),10000); fetch('https://chatgpt.com/backend-api/models',{signal:ac.signal,headers:{Authorization:'Bearer invalid',Origin:'https://chatgpt.com',Referer:'https://chatgpt.com/','User-Agent':'Mozilla/5.0'}}).then(async r=>{console.log('status',r.status)}).catch(e=>{console.error(e.name,e.message)})"

Observed: the request now completed through the proxy and printed an HTTP status instead of aborting.
That strongly suggests the default OpenClaw runtime path was not honoring the proxy env for Node/undici fetch.
Additional Observation: request headers matter on `chatgpt.com/backend-api`

I also tested the same Plus OAuth token outside OpenClaw. This request succeeded with `HTTP/2 200`:

    curl --proxy http://192.168.31.201:7890 \
      'https://chatgpt.com/backend-api/codex/responses' \
      -H "Authorization: Bearer <access_token>" \
      -H "chatgpt-account-id: <account_id>" \
      -H 'OpenAI-Beta: responses=experimental' \
      -H 'Origin: https://chatgpt.com' \
      -H 'Referer: https://chatgpt.com/' \
      -H 'User-Agent: Mozilla/5.0 ...' \
      -H 'accept: text/event-stream' \
      -H 'content-type: application/json' \
      --data '{"model":"gpt-5.4","store":false,"stream":true,"instructions":"You are a helpful assistant.","input":[{"role":"user","content":[{"type":"input_text","text":"Reply with OK only."}]}],"text":{"verbosity":"medium"},"include":["reasoning.encrypted_content"],"tool_choice":"auto","parallel_tool_calls":true}'

Response headers included:

    HTTP/2 200
    x-codex-plan-type: plus
    x-oai-request-id: ...

So this was not a bad account, bad model, or unsupported Plus plan.
What fixed it locally

I needed both of these changes to make OpenClaw runtime Codex requests work reliably:

1. Force the Node runtime to honor the env proxy

I locally set `NODE_USE_ENV_PROXY=1` before the OpenClaw gateway/runtime starts. After that, the gateway process environment included:

    HTTP_PROXY=http://192.168.31.201:7890
    HTTPS_PROXY=http://192.168.31.201:7890
    NODE_USE_ENV_PROXY=1
2. Add browser-like headers to the Codex provider request path

In `@mariozechner/pi-ai/dist/providers/openai-codex-responses.js`, I locally patched `buildHeaders()` to include:
- `Origin: https://chatgpt.com`
- `Referer: https://chatgpt.com/`
- a browser-style `User-Agent`
- `Accept-Language`
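The patched header set can be sketched roughly as follows. This is an approximation of my local change, not the actual pi-ai source; the `User-Agent` and `Accept-Language` values shown are illustrative:

```javascript
// Rough sketch of the locally patched header builder (illustrative only).
function buildHeaders(accessToken, accountId) {
  return {
    Authorization: `Bearer ${accessToken}`,
    'chatgpt-account-id': accountId,
    'OpenAI-Beta': 'responses=experimental',
    accept: 'text/event-stream',
    'content-type': 'application/json',
    // Browser-like headers that chatgpt.com/backend-api appears to expect:
    Origin: 'https://chatgpt.com',
    Referer: 'https://chatgpt.com/',
    'User-Agent': 'Mozilla/5.0',
    'Accept-Language': 'en-US,en;q=0.9',
  };
}
```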
Verification

After those changes, this worked successfully:

    openclaw agent --agent main --message 'Reply with OK only' --json --timeout 90

Observed result:

    {
      "status": "ok",
      "result": {
        "payloads": [{ "text": "OK" }],
        "meta": {
          "agentMeta": {
            "provider": "openai-codex",
            "model": "gpt-5.4"
          }
        }
      }
    }
Likely Root Cause

There seem to be two runtime compatibility issues here:
- OpenClaw's `openai-codex` runtime path depends on Node/undici `fetch` behavior that does not automatically honor `HTTP_PROXY` / `HTTPS_PROXY` unless `NODE_USE_ENV_PROXY=1` is set.
- `chatgpt.com/backend-api` appears more sensitive than ordinary API endpoints to request shape and frontend-like headers, especially behind proxy setups.
Expected Behavior

OpenClaw should make `openai-codex` runtime requests work in the same proxy environment where:
- OAuth login already succeeded
- `curl` to `chatgpt.com/backend-api/codex/responses` succeeds
- the same ChatGPT Plus account is valid

At minimum, OpenClaw should either:
- automatically enable env proxy support for runtime fetch when proxy env vars are present, or
- clearly document that `NODE_USE_ENV_PROXY=1` is required for proxied `openai-codex` runtime usage
Suggested Fix Direction

- Ensure gateway/runtime processes set up proxy-aware undici/fetch behavior consistently, not only in selected flows
- Review the default headers for `openai-codex` runtime requests in `@mariozechner/pi-ai`
- If browser-style headers are intentionally required by `chatgpt.com/backend-api`, encode that in the provider instead of relying on generic defaults
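For the first bullet, one possible shape is a launcher-side helper. This is a hypothetical sketch, not OpenClaw source: since Node reads `NODE_USE_ENV_PROXY` at process startup, the flag has to be injected into the child environment before the runtime is spawned rather than set from inside it:

```javascript
// Sketch: build the child environment for the gateway/runtime process,
// opting into env-proxy routing whenever proxy variables are present
// and the user has not set the flag explicitly.
function proxyAwareEnv(env = process.env) {
  const hasProxy = Boolean(env.HTTP_PROXY || env.HTTPS_PROXY);
  if (hasProxy && env.NODE_USE_ENV_PROXY === undefined) {
    return { ...env, NODE_USE_ENV_PROXY: '1' };
  }
  return env; // respect an explicit user choice, including '0'
}

// Usage when spawning the runtime (child_process):
// spawn('node', ['runtime.js'], { env: proxyAwareEnv() });
```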