
[Bug]: 2026.4.14 Windows chat UI regression: input text swallowed, streamed replies often invisible until refresh, typing indicator flashes then blanks #67035

@q7793527

Description

After upgrading to OpenClaw 2026.4.14 on Windows, the chat experience in the web/dashboard UI regressed significantly.

This does not look like a simple slow-model issue. The most visible failures are in input rendering and streamed-reply rendering:

  • Typed user input often does not appear immediately in the input/chat surface.
  • Sometimes part of the typed input appears much later.
  • Sometimes user input appears to be dropped entirely.
  • Assistant output is no longer reliably visible as a live stream.
  • In many cases the reply only becomes visible after manually refreshing the page.
  • The normal "assistant is typing" animated dots often flash once and then disappear, leaving a blank area.
  • Sometimes the UI briefly shows activity and then turns blank with no visible streamed content.

Expected behavior

  • User input should appear immediately while typing / after send.
  • Assistant output should stream live without requiring a manual refresh.
  • Typing indicator should remain visible while the assistant is generating.
  • No user-entered text should be lost.

Actual behavior

  • Input text is visually swallowed, delayed, or occasionally lost entirely.
  • Streamed output often fails to render live.
  • Refreshing the page can reveal content that was not shown in real time.
  • Typing indicator is unstable and often disappears too early.

Why this seems like a regression

  • This was noticeably better on the previous version in the same environment.
  • We also tested configuration changes around active-memory and model assignment, but the problem persisted.
  • In particular, active-memory worked acceptably on the previous version and became much more problematic on this version.
  • Because input rendering itself is affected, this looks more like a frontend render/update regression than plain model latency.

Environment

  • OpenClaw version: 2026.4.14 (323493f)
  • OS: Windows 10 Pro 22H2 (10.0.19045, build 19045, 64-bit)
  • Install/update path: upgraded to the official 2026.4.14 release
  • Local model setup: LM Studio is enabled and used as the primary model path in this installation

Steps to reproduce

  1. Upgrade/start OpenClaw 2026.4.14 on Windows.
  2. Open the OpenClaw chat/dashboard UI.
  3. Send normal short messages such as a greeting.
  4. Observe the input field/chat timeline while typing and immediately after send.
  5. Observe assistant streaming behavior during generation.
  6. Refresh the page and compare what becomes visible only after refresh.
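To help triage whether this is a transport problem or a render problem while reproducing, one option is to tap the page's WebSocket from the browser console and watch for incoming frames. This is a hypothetical helper for debugging only (`tapWebSocket` is not an OpenClaw API, and the assumption that the UI streams over a `WebSocket` is unverified):

```typescript
// Wrap the environment's WebSocket constructor so every incoming frame is
// reported. If frames keep arriving while the chat area stays blank, the
// regression is in rendering/reconciliation, not in transport.
function tapWebSocket(
  env: { WebSocket: any },
  onFrame: (data: unknown) => void = d => console.log("[ws-tap]", d),
): void {
  const Original = env.WebSocket;
  function Tapped(this: any, ...args: any[]) {
    const ws = new Original(...args);
    // Log every message frame without interfering with existing handlers.
    ws.addEventListener("message", (ev: { data: unknown }) => onFrame(ev.data));
    return ws;
  }
  Tapped.prototype = Original.prototype;
  env.WebSocket = Tapped;
}

// In a real browser, paste before sending a message:
//   tapWebSocket(window);
```

If frames are visible in the console while the transcript stays blank, that would strongly support the frontend-regression theory above.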

What we observed repeatedly

  • Sending a simple greeting can result in the typed text not appearing immediately in the conversation.
  • The assistant may begin to respond internally, but the UI only flashes the typing indicator briefly and then shows blank space.
  • A manual refresh can reveal content that was not rendered live.

Suspected area

Possibly a regression in one or more of:

  • chat input state synchronization
  • websocket/event-stream to UI reconciliation
  • streamed token rendering
  • active-memory related UI/update path
  • optimistic message insertion / pending message replacement
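For the last item, the "typed text vanishes entirely" symptom is exactly what a broken optimistic-insertion reconciler produces. A minimal hypothetical sketch (invented names, not OpenClaw code) of the suspected pattern:

```typescript
// Hypothetical optimistic-insertion reconciler. The client inserts a
// pending message immediately, then replaces it when the server ack
// arrives. If the ack's id does not match the client-generated pending id,
// this version silently drops both: the typed text simply vanishes.
interface Message { id: string; text: string; pending?: boolean }

function reconcile(timeline: Message[], ack: Message, pendingId: string): Message[] {
  // Remove the optimistic entry...
  const rest = timeline.filter(m => m.id !== pendingId);
  // ...but only re-insert the confirmed message when the ids line up.
  // BUG PATTERN: when ack.id !== pendingId, the ack is discarded and the
  // user's message is lost from the visible timeline.
  if (ack.id === pendingId) {
    return [...rest, { ...ack, pending: false }];
  }
  return rest;
}
```

A robust reconciler would append the acked message unconditionally (or match on a client-supplied correlation key rather than the server-assigned id), so an id mismatch delays rather than destroys the message.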

Additional notes

  • Please treat this as a UX-blocking regression: it makes normal conversation feel unreliable even when the backend may still be doing work.
  • No secrets, tokens, or personal paths included here.
