Bug type
Regression (worked before, now fails)
Summary
On macOS, OpenClaw can complete openai-codex OAuth successfully and enumerate Codex models, but actual requests to Codex models never complete. The agent always falls back to Gemini.
Environment
- OpenClaw: 2026.3.7 (GitHub source build)
- macOS (Apple Silicon)
- Proxy: Clash at 127.0.0.1:7897
- OpenAI Codex OAuth login succeeds
- Official codex CLI works with the same account and proxy
Steps to reproduce
- Install OpenClaw from GitHub source so the active version is 2026.3.7
- Set proxy env:
  HTTP_PROXY=http://127.0.0.1:7897
  HTTPS_PROXY=http://127.0.0.1:7897
  ALL_PROXY=http://127.0.0.1:7897
  NO_PROXY=127.0.0.1,localhost
- Run: openclaw models auth login --provider openai-codex
- Complete browser OAuth successfully
- Set the main model to one of:
  openai-codex/gpt-5.1-codex-mini
  openai-codex/gpt-5.3-codex
  openai-codex/gpt-5.4
- Restart the gateway
- Run: openclaw agent --agent main --message 'ping' --json
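The environment setup above as a single shell session (the model-selection step is left as a comment, since the exact configuration command isn't shown in this report):

```shell
# Proxy environment used for the reproduction (Clash on 127.0.0.1:7897)
export HTTP_PROXY=http://127.0.0.1:7897
export HTTPS_PROXY=http://127.0.0.1:7897
export ALL_PROXY=http://127.0.0.1:7897
export NO_PROXY=127.0.0.1,localhost

# Then, in the same shell:
#   openclaw models auth login --provider openai-codex
#   (set the main model to an openai-codex/* model, restart the gateway)
#   openclaw agent --agent main --message 'ping' --json
```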
Expected behavior
OpenClaw should answer using the configured openai-codex/* model.
Actual behavior
OpenClaw starts with the Codex model configured, but the request does not complete through Codex. The final reply always comes from:
google-gemini-cli/gemini-3-flash-preview
Evidence
- OAuth succeeds:
  Auth profile: openai-codex:default (openai-codex/oauth)
- Codex models enumerate correctly:
  openai-codex/gpt-5.1-codex-mini
  openai-codex/gpt-5.3-codex
  openai-codex/gpt-5.4
- Gateway confirms the configured model:
  agent model: openai-codex/gpt-5.4
- But the actual agent response metadata shows:
  provider: google-gemini-cli
  model: gemini-3-flash-preview
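A quick way to confirm the fallback from the metadata above — the field values are taken from this report, and the comparison logic is a sketch, not an OpenClaw feature:

```shell
# Configured model vs. the provider that actually answered (values from the
# evidence above; in practice actual_provider would be parsed from the --json reply)
configured="openai-codex/gpt-5.4"
actual_provider="google-gemini-cli"

# Strip everything after the first "/" to get the expected provider
expected_provider="${configured%%/*}"

if [ "$actual_provider" != "$expected_provider" ]; then
  echo "fallback detected: expected $expected_provider, got $actual_provider"
fi
```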
OpenClaw version
2026.3.7
Operating system
macOS 14.5
Install method
No response
Logs, screenshots, and evidence
Impact and severity
No response
Additional information
No response