Hi, I'm running OpenClaw 2026.2.24 on a Vultr VPS with Telegram as my main channel.
My auth is ChatGPT Pro via OAuth (openai-codex provider).
The problem: GPT-4o works perfectly when using the openai-completions API, but every GPT-5+ model fails immediately with:
`Failed to extract accountId from token`
Models tested:
- gpt-4o → ✅ Works (via openai-completions)
- gpt-5 → ❌ Failed to extract accountId from token
- gpt-5-mini → ❌ Same error
- gpt-5.2 → ❌ Same error
- gpt-4.1 → ❌ Same error
What I've tried:
- Updated OpenClaw from 2026.2.23 to 2026.2.24
- Re-authenticated via `openclaw agents add main` (OAuth flow)
- Changed the API endpoint from `openai-responses` to `openai-completions` in models.json (this is what fixed GPT-4o)
- Added the GPT-5+ models explicitly in models.json with `"api": "openai-completions"`
- Cleared cooldowns and error counts in auth-profiles.json
- Verified the model definitions are correct in both openclaw.json and models.json
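For reference, this is roughly the entry I added to models.json. The `"api": "openai-completions"` field is the relevant part; the surrounding structure reflects my config and may differ from the canonical schema:

```json
{
  "models": {
    "gpt-5.2": {
      "provider": "openai-codex",
      "api": "openai-completions"
    }
  }
}
```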
Background: The original issue was that gpt-5.3-codex started failing with `HTTP 401: Missing scopes: api.responses.write`. My OAuth token only has the scopes `openid`, `profile`, `email`, and `offline_access`; it is missing the `api.responses.write` scope that the Responses API now requires.
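This is how I checked the scopes: decoding the JWT payload locally. A stdlib-only sketch with a dummy token; it does no signature verification, and the `scope` claim name is my assumption about how the token is laid out:

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT without verifying the signature."""
    payload = token.split(".")[1]
    # base64url-encoded payloads often omit the '=' padding; restore it
    payload += "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(payload))

# Dummy token (header.payload.signature) standing in for a real access token
body = base64.urlsafe_b64encode(
    json.dumps({"scope": "openid profile email offline_access"}).encode()
).decode().rstrip("=")
claims = jwt_claims("eyJhbGciOiJub25lIn0." + body + ".sig")
print(claims["scope"])  # → openid profile email offline_access
```

Running this against my real token shows the same four scopes and no `api.responses.write`.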
Switching to openai-completions fixed GPT-4o, but OpenClaw seems to internally force a different code path for GPT-5+ models that still fails.
Questions:
- Is there a way to force GPT-5+ models to use the Chat Completions API path with OAuth?
- Is there a plan to update the OAuth flow to request the `api.responses.write` scope?
- Any workaround to use GPT-5.2 or GPT-4.1 with a ChatGPT Pro subscription?
Relevant config:
- Provider: openai-codex with OAuth
- API set to: openai-completions
- Auth profile type: oauth
- OpenClaw version: 2026.2.24 (df9a474)
- OS: Ubuntu on Vultr VPS