
Model switch via /model command reports success but doesn't actually switch #1435

@OolonColoophid

Description


Bug

The /model command reports a successful switch, but actual inference continues to use the previous model.

Steps to Reproduce

  1. Configure multiple models in agents.defaults.models:

     ```json
     "models": {
       "claude-opus-4-5": {},
       "openai-codex/codex-1": { "alias": "codex" },
       "openai/gpt-5.2": { "alias": "gpt" }
     }
     ```

  2. Run /model codex in Telegram.

  3. A system message appears: "Model switched to codex (openai-codex/codex-1)".

  4. Check /status: it still shows anthropic/claude-opus-4-5.

  5. Logs confirm that inference still uses Opus:

     ```
     agent model: anthropic/claude-opus-4-5
     embedded run start: ... provider=anthropic model=claude-opus-4-5
     ```
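For reference, a minimal sketch of how the models block above might sit in a full config file. The surrounding agents.defaults nesting is inferred from the setting path given in step 1, and the file layout shown here is an assumption, not taken from this report:

```json
{
  "agents": {
    "defaults": {
      "models": {
        "claude-opus-4-5": {},
        "openai-codex/codex-1": { "alias": "codex" },
        "openai/gpt-5.2": { "alias": "gpt" }
      }
    }
  }
}
```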

Additional Issue

Using the session_status tool with the model parameter returns:

```
Model "openai-codex/codex-1" is not allowed.
```

This happens even though the model:

  • ✅ Is in the config (agents.defaults.models)
  • ✅ Has valid OAuth (verified via clawdbot models)
  • ✅ Is recognized by the clawdbot models command
  • ✅ Has an alias that works in the config

Expected Behavior

After /model codex, subsequent inference should use openai-codex/codex-1.

Environment

  • Clawdbot 2026.1.20-2
  • macOS
  • Channel: Telegram
