[Bug]: TUI/webchat can stay in pondering after codex/gpt-5.4 has already finished the turn #66470

@AltamimiYasser

Description

Bug type

Crash (process/app exits or hangs)

Beta release blocker

No

Summary

With codex/gpt-5.4 on OpenClaw 2026.4.12, the backend finishes the turn and writes the assistant reply, but the TUI/webchat can remain in `pondering` for tens of seconds before rendering the final response.

Steps to reproduce

  1. Start OpenClaw 2026.4.12 with codex/gpt-5.4 as the default model.
  2. Open a fresh TUI or webchat session.
  3. Send a trivial prompt such as "testing", wait for the first reply, then send a second trivial prompt.
  4. Observe that the UI can remain in pondering for about 39 seconds after the backend has already appended the assistant reply to the session file.
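To confirm step 4 independently of the UI, the session file can be inspected directly while the TUI still shows `pondering`. A minimal sketch, assuming each line of the session `.jsonl` is a JSON object with a `"role"` field (the helper name and the exact field layout are assumptions, not OpenClaw's documented schema):

```shell
# Print the newest assistant line from a session .jsonl so it can be compared
# against what the TUI is currently showing. Adjust the grep pattern if the
# actual session schema differs.
last_assistant() {
  grep '"role":"assistant"' "$1" | tail -n 1
}

# Example (path from this report; sessionID is a placeholder):
# last_assistant ~/.openclaw/agents/main/sessions/sessionID.jsonl
```

If this prints the second reply while the UI is still in `pondering`, the backend has already completed the turn.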

Expected behavior

When the backend finishes a Codex turn, the TUI/webchat should render the final reply immediately instead of continuing to show pondering.

Actual behavior

In live PTY testing, the assistant reply was already written to the session file, but the TUI continued showing `pondering` for about 39 seconds before I interrupted it. On the same install, gateway startup also logged `startup model warmup failed for codex/gpt-5.4: Error: Unknown model: codex/gpt-5.4`.

OpenClaw version

2026.4.12

Operating system

Ubuntu 24.04.4 LTS on WSL

Install method

npm global

Model

codex/gpt-5.4

Provider / routing chain

openclaw -> codex app-server -> codex/gpt-5.4

Additional provider/model setup details

No response

Logs, screenshots, and evidence

Observed evidence from local testing:

- Gateway startup log: `startup model warmup failed for codex/gpt-5.4: Error: Unknown model: codex/gpt-5.4`
- In a live Codex TUI test, the backend had already written the assistant reply to the session file, but the frontend still showed `pondering` for about 39 seconds.

Relevant local session file:
- `~/.openclaw/agents/main/sessions/sessionID.jsonl`

Relevant local gateway journal:
- `journalctl --user -u openclaw-gateway`
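
To pull the warmup failures out of the journal above, something along these lines works (the capture path and helper name are arbitrary; the grep pattern matches the log line quoted in the evidence):

```shell
# Filter a saved gateway journal for the warmup failure line quoted above.
warmup_failures() {
  grep -F 'startup model warmup failed for codex/gpt-5.4' "$1"
}

# Capture the journal first (command from this report), then filter:
# journalctl --user -u openclaw-gateway > /tmp/gateway.log
# warmup_failures /tmp/gateway.log
```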

Impact and severity

Affected: TUI/webchat users running codex/gpt-5.4 on OpenClaw 2026.4.12
Severity: High
Frequency: Reproduced across multiple fresh Codex sessions before local patching
Consequence: Interactive Codex sessions appear hung or much slower than the backend completion time

Additional information

A local runtime patch that registered the started agent run id in chat.send removed the delayed terminal event behavior in my install. A separate local patch that stopped skipping provider runtime hooks removed the `Unknown model: codex/gpt-5.4` warmup log on startup.
