fix(gateway): filter NO_REPLY messages from chat history endpoint#27258

Closed
Sid-Qin wants to merge 1 commit into openclaw:main from Sid-Qin:fix/27238-chat-history-no-reply-filter
Conversation


Sid-Qin (Contributor) commented Feb 26, 2026

Summary

  • Problem: NO_REPLY / [NO_REPLY] sentinel tokens leak into chat history when the non-streaming path stores them as regular assistant messages, causing the model to parrot them back to users.
  • Why it matters: Users see raw [NO_REPLY] text in bot responses, which is confusing and breaks the conversation flow.
  • What changed: Added isSilentReplyText() filtering in sanitizeChatHistoryMessages() to strip silent-reply messages from history before they reach the model, matching the streaming path's existing behavior.
  • What did NOT change (scope boundary): The streaming path, reply suppression logic, and isSilentReplyText() detection are unaffected. Only the history sanitization was missing the filter.
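As a rough illustration of the check described above: the real isSilentReplyText() lives in the OpenClaw gateway code, so the body below is an assumption-based sketch of the shape of the predicate, not the repository's implementation.

```typescript
// Sketch only: illustrates the silent-reply check named in this PR.
// The token spellings ("NO_REPLY" / "[NO_REPLY]") come from the PR
// description; the actual matching rules may be broader.
function isSilentReplyText(text: string): boolean {
  const trimmed = text.trim();
  return trimmed === "NO_REPLY" || trimmed === "[NO_REPLY]";
}
```

With a predicate like this, the history sanitizer can drop any assistant message whose extracted text matches, mirroring what the streaming path already does.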

Change Type (select all)

  • [x] Bug fix
  • [ ] Feature
  • [ ] Refactor
  • [ ] Docs
  • [ ] Security hardening
  • [ ] Chore/infra

Scope (select all touched areas)

  • [x] Gateway / orchestration
  • [ ] Skills / tool execution
  • [ ] Auth / tokens
  • [ ] Memory / storage
  • [ ] Integrations
  • [ ] API / contracts
  • [ ] UI / DX
  • [ ] CI/CD / infra

Linked Issue/PR

Closes openclaw#27238

User-visible / Behavior Changes

NO_REPLY / [NO_REPLY] sentinel tokens no longer appear in bot responses. Silent replies are now consistently filtered from chat history across both streaming and non-streaming paths.

Security Impact (required)

  • New permissions/capabilities: None
  • Auth/token changes: None
  • Data exposure risk: None — prevents leaking internal sentinel tokens into user-visible messages

Testing

  • npx vitest run src/agents/pi-embedded-runner.sanitize-session-history — 18 tests ✓
  • npx vitest run src/auto-reply/reply/ — 479 tests ✓

Rollback Plan

Revert the single commit. No migration or data changes involved.

The streaming path (emitChatDelta/emitChatFinal) already suppresses
assistant messages containing NO_REPLY tokens, but the chat.history
endpoint returned them unfiltered. When users refresh webchat or open
a new connection, historical NO_REPLY messages appeared in the UI.

Add extractAssistantText() to pull text from various message formats
and filter out matching messages in sanitizeChatHistoryMessages(),
bringing the history path in line with the streaming path.

Closes openclaw#27238
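The three content formats mentioned above suggest a helper along these lines. This is a hedged sketch: the ChatMessage shape and the exact field names are assumptions for illustration, not the gateway's real types.

```typescript
// Assumed message shape; the real types live in the gateway codebase.
type ChatMessage = {
  role: string;
  content?: string | Array<{ type: string; text?: string }>;
  text?: string; // legacy plain-text field
};

// Sketch of extractAssistantText(): return plain text from any of the
// three formats (string, content-block array, legacy text field), or
// undefined when nothing can be extracted so the caller can fall back
// to preserving the message.
function extractAssistantText(msg: ChatMessage): string | undefined {
  if (typeof msg.content === "string") return msg.content;
  if (Array.isArray(msg.content)) {
    const parts = msg.content
      .filter((block) => block.type === "text" && typeof block.text === "string")
      .map((block) => block.text as string);
    return parts.length > 0 ? parts.join("\n") : undefined;
  }
  if (typeof msg.text === "string") return msg.text; // legacy format
  return undefined;
}
```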

greptile-apps bot commented Feb 26, 2026

Greptile Summary

Filters NO_REPLY (silent reply) assistant messages from the chat.history endpoint response, aligning the history path with the streaming path (emitChatDelta/emitChatFinal) which already suppresses these via isSilentReplyText(). This prevents internal NO_REPLY tokens from appearing in the webchat UI when users refresh or open a new connection.

  • Adds extractAssistantText() helper that safely extracts text from assistant messages across all content formats (string, content-block array, legacy text field)
  • Converts sanitizeChatHistoryMessages() from .map() to a for loop to support dropping messages mid-iteration
  • Non-assistant messages and messages with unextractable text are preserved (safe fallback)
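Put together, the described .map()-to-for-loop conversion might look roughly like this. The helper bodies are simplified stand-ins for the real isSilentReplyText()/extractAssistantText(), and the Msg type is an assumption:

```typescript
type Msg = { role: string; content?: string };

// Simplified stand-ins for the real helpers in the gateway code.
const isSilentReplyText = (t: string): boolean => {
  const s = t.trim();
  return s === "NO_REPLY" || s === "[NO_REPLY]";
};
const extractAssistantText = (m: Msg): string | undefined =>
  typeof m.content === "string" ? m.content : undefined;

// A for loop (rather than .map()) lets the sanitizer drop messages
// mid-iteration; anything without extractable text is preserved.
function sanitizeChatHistoryMessages(messages: Msg[]): Msg[] {
  const result: Msg[] = [];
  for (const msg of messages) {
    if (msg.role === "assistant") {
      const text = extractAssistantText(msg);
      if (text !== undefined && isSilentReplyText(text)) {
        continue; // drop the silent-reply sentinel
      }
    }
    result.push(msg); // safe fallback: keep everything else
  }
  return result;
}
```

The safe-fallback branch is the key design choice the review calls out: a message is only dropped when text extraction succeeds and the sentinel check matches, so unusual formats can never be lost by accident.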

Confidence Score: 5/5

  • This PR is safe to merge — it applies the same well-tested isSilentReplyText() check already used in the streaming path to the history endpoint.
  • Single-file change with minimal scope. The filtering logic reuses the existing isSilentReplyText() function that's already battle-tested in the streaming path. The safe fallback (preserve messages when text extraction fails) prevents accidental data loss. The extractAssistantText helper correctly handles all three message content formats and only targets assistant-role messages.
  • No files require special attention.

Last reviewed commit: 70d0f85

@openclaw-barnacle

This pull request has been automatically marked as stale due to inactivity.
Please add updates or it will be closed.

openclaw-barnacle bot added the stale label on Mar 4, 2026

Labels

app: web-ui, gateway (Gateway runtime), size: XS, stale (Marked as stale due to inactivity)

Development

Successfully merging this pull request may close these issues.

Bug: NO_REPLY tokens leak in chat history endpoint (PR #16286 incomplete fix)
