## Bug

The `/model` command reports a successful switch, but the actual inference continues using the previous model.
## Steps to Reproduce

- Configure multiple models in `agents.defaults.models`:

  ```json
  "models": {
    "claude-opus-4-5": {},
    "openai-codex/codex-1": { "alias": "codex" },
    "openai/gpt-5.2": { "alias": "gpt" }
  }
  ```

- Run `/model codex` in Telegram.
- A system message appears: "Model switched to codex (openai-codex/codex-1)".
- Check `/status`: it still shows `anthropic/claude-opus-4-5`.
- Logs confirm inference still uses Opus:

  ```
  agent model: anthropic/claude-opus-4-5
  embedded run start: ... provider=anthropic model=claude-opus-4-5
  ```
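This looks like the per-session override never reaching the run path. A minimal TypeScript sketch of that suspected failure mode, assuming hypothetical names (`SessionState`, `resolveModelForRun`, and friends are illustrative, not Clawdbot's actual internals):

```typescript
// Hypothetical sketch of the suspected failure mode; these types and
// functions are illustrative, not Clawdbot's actual code.

interface SessionState {
  modelOverride?: string; // set by /model, e.g. "openai-codex/codex-1"
}

interface Config {
  defaultModel: string; // e.g. "anthropic/claude-opus-4-5"
}

// The /model handler records the override and reports success...
function handleModelCommand(session: SessionState, modelId: string): string {
  session.modelOverride = modelId;
  return `Model switched to ${modelId}`;
}

// ...but the run path resolves the model from config alone, so the
// confirmation message and the actual inference disagree.
function resolveModelForRun(session: SessionState, config: Config): string {
  // Suspected bug: should be `session.modelOverride ?? config.defaultModel`.
  return config.defaultModel;
}

const session: SessionState = {};
const config: Config = { defaultModel: "anthropic/claude-opus-4-5" };
console.log(handleModelCommand(session, "openai-codex/codex-1")); // reports success
console.log(resolveModelForRun(session, config)); // still "anthropic/claude-opus-4-5"
```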
## Additional Issue

Using the `session_status` tool with a `model` parameter returns:

```
Model "openai-codex/codex-1" is not allowed.
```

Even though the model is:

- ✅ In the config (`agents.defaults.models`)
- ✅ Has valid OAuth (verified via `clawdbot models`)
- ✅ Recognized by the `clawdbot models` command
- ✅ Its alias works in the config
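One hypothetical way this rejection could arise (the provider map and prefixing rule below are assumptions for illustration, not Clawdbot's code): if the allowlist is derived from config keys through a fixed provider map, a model under an unrecognized provider could be silently dropped even though it is configured.

```typescript
// Hypothetical illustration only -- not Clawdbot's actual internals.
// If the allowlist is built by running config keys through a provider
// map that only knows some providers, entries under other providers
// never reach the allowlist, and the validator then rejects them.

const KNOWN_PROVIDERS = new Set(["anthropic", "openai"]);

function buildAllowlist(configKeys: string[]): Set<string> {
  const allowed = new Set<string>();
  for (const key of configKeys) {
    // Keys without a prefix get the default provider (assumption).
    const provider = key.includes("/") ? key.split("/")[0] : "anthropic";
    if (KNOWN_PROVIDERS.has(provider)) {
      allowed.add(key.includes("/") ? key : `${provider}/${key}`);
    }
    // "openai-codex/codex-1" falls through: its provider is not in the
    // map, so a validly configured model is never marked as allowed.
  }
  return allowed;
}

const allowlist = buildAllowlist([
  "claude-opus-4-5",
  "openai-codex/codex-1",
  "openai/gpt-5.2",
]);
console.log(allowlist.has("openai-codex/codex-1")); // false -> "is not allowed"
```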
## Expected Behavior

After `/model codex`, subsequent inference should use `openai-codex/codex-1`.
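Presumably the corresponding log lines would then read (inferred from the log format quoted above, not observed output):

```
agent model: openai-codex/codex-1
embedded run start: ... provider=openai-codex model=codex-1
```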
## Environment

- Clawdbot 2026.1.20-2
- macOS
- Channel: Telegram