Bug type
Behavior bug (incorrect output/state without crash)
Summary
Description:
The Control UI Chat tab shows a large warning triangle that covers the chat input textarea when the session's inputTokens exceeds contextTokens. The textarea exists in the DOM (.agent-chat__input > textarea) but is visually hidden behind the overlay. Opening DevTools triggers a reflow that makes the input visible again.
Additionally, there's a token count mismatch between views:
Session data shows inputTokens: 157,556 vs contextTokens: 128,000 (over limit)
Sessions list shows totalTokens: 22,838 (well under limit)
User reported seeing "215.2k / 128k" in one view and "27,145 / 128,000" in another
Steps to reproduce
Use a local model with 128k context (e.g. ollama/glm-4.7-flash)
Accumulate conversation history until inputTokens exceeds contextTokens
Open Chat tab → warning triangle covers input, no error text
Expected behavior
It would be sufficient to set `disabled = true` on the input field and extend the token counter with a clear error message (e.g. "Context limit exceeded — compact session to continue"); a "Compact Session" button should also be available.
Token count should be consistent across views (or clearly labeled which metric is shown)
Workaround: Open DevTools (F12) to trigger reflow, or use Discord channel instead.
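The expected behavior above could be sketched as a small state derivation in the UI layer. This is a minimal illustration, not OpenClaw's actual code; the names `SessionTokens`, `ChatInputState`, and `chatInputState` are hypothetical.

```typescript
// Hypothetical shapes; OpenClaw's real session object may differ.
interface SessionTokens {
  inputTokens: number;   // tokens currently consumed by the session
  contextTokens: number; // model's context window limit
}

interface ChatInputState {
  disabled: boolean;
  warning: string | null;
}

// Instead of overlaying a warning triangle on the textarea, derive a
// disabled state plus an actionable message when the limit is exceeded.
function chatInputState(s: SessionTokens): ChatInputState {
  if (s.inputTokens > s.contextTokens) {
    return {
      disabled: true,
      warning: `Context limit exceeded (${s.inputTokens.toLocaleString()} / ${s.contextTokens.toLocaleString()}) — compact session to continue`,
    };
  }
  return { disabled: false, warning: null };
}
```

The renderer would then bind `disabled` to the textarea and show `warning` (with a "Compact Session" button) in place of the bare triangle, so the failure state is always explained.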
Actual behavior
No error text is shown; the warning triangle overlay hides the chat input without any explanation.
OpenClaw version
2026.3.12
Operating system
Windows 11
Install method
npm global
Model
ollama/glm-4.7-flash
Provider / routing chain
openclaw gateway loopback
Config file / key location
No response
Additional provider/model setup details
No response
Logs, screenshots, and evidence
Impact and severity
Webcontrol App Users
Severity: irritating
Additional information
No response