fix(chat): separate response and thinking messages#878

Merged
su8su merged 3 commits into main from fix/seperate_response_and_thinking_message
Apr 20, 2026

Conversation

@hazeone (Contributor) commented Apr 20, 2026

Summary

Separate streamed thinking content from the assistant's reply so each renders as its own message bubble.

Type of Change

  • [x] Bug fix
  • [ ] New feature
  • [ ] Documentation
  • [ ] Refactor
  • [ ] Other

Validation

Checklist

  • I ran relevant checks/tests locally.
  • I updated docs if behavior or interfaces changed.
  • I verified there are no unrelated changes in this PR.

hazeone and others added 2 commits April 20, 2026 11:16
…ol calls

- Detect server-side tool execution via historical messages (hasCompletedToolPhase)
  when Gateway doesn't send streaming tool events to the client
- Prevent execution graph from auto-collapsing while reply is still streaming
  by keeping expanded=true and excluding from autoCollapsedRunKeys
- Omit streaming thinking segments from execution graph when reply renders
  as a separate bubble to avoid duplication

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
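The first bullet above describes inferring tool execution from history when no streaming tool events arrive. A minimal sketch of that idea, using hypothetical message shapes and helper names (not the actual ClawX types):

```typescript
// Illustrative message shape; the real ClawX types differ.
interface ChatMessage {
  role: "user" | "assistant" | "tool";
  content: string;
}

// Infer that the server already ran its tool phase when the stored
// history contains tool-result messages, even though no streaming
// tool events reached this client from the Gateway.
function hasCompletedToolPhase(history: ChatMessage[]): boolean {
  return history.some((m) => m.role === "tool");
}

const history: ChatMessage[] = [
  { role: "user", content: "What's the weather?" },
  { role: "tool", content: '{"temp": 21}' },
  { role: "assistant", content: "It's 21 degrees." },
];

console.log(hasCompletedToolPhase(history)); // true
console.log(hasCompletedToolPhase([{ role: "user", content: "hi" }])); // false
```

The point is that the client falls back to a history scan as the signal, rather than relying on tool events that the Gateway never sent.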
@hazeone hazeone marked this pull request as ready for review April 20, 2026 05:58

@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 614802aa03

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment thread on src/pages/Chat/index.tsx (outdated), lines +385 to +386:

```tsx
const shouldCollapse = !isStillStreaming
  && (card.replyIndex != null && replyTextOverrides.has(card.replyIndex));
```

P2: Restore post-stream auto-collapse for completed runs

shouldCollapse now excludes runs that just finished streaming unless a replyTextOverrides entry exists. During streaming we force expanded=true, so ExecutionGraphCard stays in controlled mode and its internal uncontrolledExpanded never syncs to active=false; when streaming ends and control is removed, cards without a reply override remain expanded permanently. This is visible on tool runs whose final reply text doesn't need stripping, and it contradicts the intended “collapse after completion” behavior.

Useful? React with 👍 / 👎.
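The bug the review describes can be sketched as a pure predicate. `RunCard`, `replyTextOverrides`, and both function names below are illustrative stand-ins for the real ClawX code, and the "fixed" variant is only one possible direction, not the merged fix:

```typescript
// Stand-in for the execution-graph card state.
interface RunCard {
  replyIndex: number | null;
}

// Current predicate from the diff: a finished run whose final reply
// needed no stripping has no replyTextOverrides entry, so the &&
// chain short-circuits to false and the card never collapses.
function shouldCollapseCurrent(
  card: RunCard,
  isStillStreaming: boolean,
  replyTextOverrides: Map<number, string>,
): boolean {
  return (
    !isStillStreaming &&
    card.replyIndex != null &&
    replyTextOverrides.has(card.replyIndex)
  );
}

// One possible direction: collapse every run once streaming ends,
// regardless of whether a reply-text override exists.
function shouldCollapseFixed(card: RunCard, isStillStreaming: boolean): boolean {
  return !isStillStreaming;
}

const card: RunCard = { replyIndex: 3 };
console.log(shouldCollapseCurrent(card, false, new Map())); // false (stuck open)
console.log(shouldCollapseFixed(card, false)); // true
```

This mirrors the review's point: while streaming, the card is forced to `expanded=true` (controlled mode), so once streaming ends, the collapse decision itself must fire, or the card stays expanded permanently.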

@su8su su8su merged commit 7fa4852 into main Apr 20, 2026
8 of 9 checks passed
@su8su su8su deleted the fix/seperate_response_and_thinking_message branch April 20, 2026 07:22
DigitalNomad-Chat added a commit to DigitalNomad-Chat/ClawX that referenced this pull request Apr 26, 2026
… dedupe (ValueCell-ai#821 ValueCell-ai#845 ValueCell-ai#870 ValueCell-ai#873 ValueCell-ai#875 ValueCell-ai#878 ValueCell-ai#880 ValueCell-ai#885 ValueCell-ai#886 ValueCell-ai#887 ValueCell-ai#891 ValueCell-ai#903)

Overhaul execution graph card (collapse/expand, narration steps, web_fetch links),
separate thinking messages, render LaTeX math, dedupe optimistic messages,
hide recoverable gateway timeouts, add startup history recovery.
