Request
Please add openai-codex/gpt-5.2 to the xhigh thinking whitelist (XHIGH_MODEL_REFS).
Context
In dist/auto-reply/thinking.js, xhigh is currently restricted to a small hard-coded set (e.g. openai/gpt-5.2, openai-codex/gpt-5.2-codex, openai-codex/gpt-5.1-codex).
However, many setups (including ours) use the OpenAI Codex provider and run the default/allowed model as openai-codex/gpt-5.2 (alias gpt52). When setting thinking="xhigh", we hit:
Error: Thinking level "xhigh" is only supported for openai/gpt-5.2, openai-codex/gpt-5.2-codex or openai-codex/gpt-5.1-codex.
Proposal
Add openai-codex/gpt-5.2 to the allowlist so users on the codex provider can use xhigh with the base gpt-5.2 model (not only the -codex variant).
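For illustration, a minimal sketch of what the change might look like. The constant name XHIGH_MODEL_REFS comes from this request; the surrounding structure (a Set plus a check helper named assertXhighSupported) is an assumption, not the actual contents of dist/auto-reply/thinking.js:

```javascript
// Hypothetical sketch of the proposed allowlist change in
// dist/auto-reply/thinking.js. Only XHIGH_MODEL_REFS is named in the
// request; the Set layout and helper function are illustrative.
const XHIGH_MODEL_REFS = new Set([
  "openai/gpt-5.2",
  "openai-codex/gpt-5.2-codex",
  "openai-codex/gpt-5.1-codex",
  "openai-codex/gpt-5.2", // proposed addition: base model on the codex provider
]);

// Illustrative check: reject thinking="xhigh" for models outside the allowlist.
function assertXhighSupported(modelRef) {
  if (!XHIGH_MODEL_REFS.has(modelRef)) {
    throw new Error(
      `Thinking level "xhigh" is only supported for ` +
        [...XHIGH_MODEL_REFS].join(", ") + "."
    );
  }
}
```

With this one-line addition, openai-codex/gpt-5.2 passes the check the same way openai/gpt-5.2 already does.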
Why this matters
- Keeps behavior consistent across providers (openai vs openai-codex).
- Avoids confusing configuration loops where gpt52 works but xhigh is rejected.
Thanks!