
openclaw models status --probe returns LLM request timed out on 2026.3.8 #41874

@BigApple12138

Description

### Version
- OpenClaw: 2026.3.8 (3caab92)
- Install method: npm reinstall/upgrade (user report: upgrading to 3.8 caused issues, then rolled back manually)

### Command
```bash
openclaw models status --agent main --probe
```

### Observed
- Warning:
  - tools.profile (coding) allowlist contains unknown entries (apply_patch, cron, image). These entries won't match any tool unless the plugin is enabled.
- Probe result:
  - Model openai-codex/gpt-5.2, profile openai-codex:default (oauth): unknown (~8.4s)
  - Message: LLM request timed out.
- Auth appears valid (the OpenAI OAuth profile is present and not expired), but the probe request itself times out.

### Expected
- The probe should succeed, or fail with a clearer, actionable error. Ideally the probe timeout would be configurable to avoid false negatives under normal network latency.

### Notes
- Is the probe timeout configurable?
- Is there a known regression in 2026.3.8 around model probing / timeouts?
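To illustrate the distinction the report is asking for (a timeout is not the same failure as bad auth), here is a minimal, hypothetical probe sketch with a configurable deadline. The `probe` helper and its return values are illustrative only, not OpenClaw's actual implementation; the demo server accepts the connection but never replies, which mirrors the reported symptom of a valid session that still times out.

```python
import socket
import threading

def probe(host: str, port: int, timeout: float) -> str:
    """Minimal TCP probe with a configurable timeout (hypothetical sketch,
    not OpenClaw's real probe). Returns 'ok', 'timeout', or 'error' so a
    caller can distinguish latency from outright connection failures."""
    try:
        with socket.create_connection((host, port), timeout=timeout) as conn:
            conn.settimeout(timeout)
            conn.sendall(b"PING\n")
            conn.recv(1)  # wait for any reply within the deadline
        return "ok"
    except socket.timeout:
        return "timeout"
    except OSError:
        return "error"

# Demo: a local server that accepts the connection but never replies,
# so the probe reports 'timeout' even though the connection itself works.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=server.accept, daemon=True).start()

print(probe("127.0.0.1", port, timeout=0.5))  # -> timeout
```

A probe shaped like this makes the timeout an explicit, tunable parameter rather than a hard-coded constant, which is what the Expected section is requesting.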
