feat(chat): add Claude /btw side-question #138
Conversation
bbsngg left a comment
PR Review: /btw side-question feature
Overall the idea is solid and the UX is well thought out. A few issues worth addressing before merge:
🔴 Critical
1. Race condition with process.env mutation (server/claude-sdk.js:818-827)
const prevStreamTimeout = process.env.CLAUDE_CODE_STREAM_CLOSE_TIMEOUT;
process.env.CLAUDE_CODE_STREAM_CLOSE_TIMEOUT = '120000';
const queryInstance = query({ ... });
// immediately restores env var

This is not safe under concurrent requests — two simultaneous /btw calls will stomp on each other's env var. The restore also happens before the stream is consumed (for await), so the env var may already be reset by the time it's actually needed. Consider passing this as a config option instead of mutating process.env.
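A minimal sketch of the safer shape, assuming the env var must stay set until the stream is fully consumed (the helper name and wiring here are hypothetical; the SDK's query call is elided):

```typescript
// Hypothetical helper: set the stream-timeout env var, consume the stream,
// and restore the previous value only in a finally block, after the last
// chunk has been read (or the stream has thrown).
async function runWithStreamTimeout<T>(
  run: () => AsyncIterable<T>,
  onChunk: (chunk: T) => void,
): Promise<void> {
  const prev = process.env.CLAUDE_CODE_STREAM_CLOSE_TIMEOUT;
  process.env.CLAUDE_CODE_STREAM_CLOSE_TIMEOUT = '120000';
  try {
    for await (const chunk of run()) {
      onChunk(chunk); // env var is still in effect for the whole stream
    }
  } finally {
    // Restore (or delete) only after consumption finishes or throws.
    if (prev === undefined) {
      delete process.env.CLAUDE_CODE_STREAM_CLOSE_TIMEOUT;
    } else {
      process.env.CLAUDE_CODE_STREAM_CLOSE_TIMEOUT = prev;
    }
  }
}
```

This still mutates shared process state, so two concurrent /btw calls can observe each other's value; passing the timeout as a per-request option, if the SDK supports one, would remove the race entirely.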
2. Textarea disabled removed entirely (ChatComposer.tsx:381)
disabled={isLoading} was removed from the textarea entirely (along with disabled:opacity-50 in the className). This means during normal streaming the user can type freely and the textarea is no longer visually dimmed. The intent was to allow / commands while streaming, but the fix is too broad — consider keeping disabled and only conditionally allowing input when the text starts with /.
🟡 Should fix
3. Submit button allows any /-prefixed text during streaming (ChatComposer.tsx:421-424)
(isLoading && !input.trim().startsWith('/'))

This lets the user submit any slash command while streaming (e.g. /clear, /rewind), not just /btw. Other commands probably shouldn't be executable mid-stream.
4. handleSubmit guard reordering (useChatComposerState.ts:922-952)
The isLoading check was moved after slash command processing. If a non-/btw slash command is entered during streaming, it won't be caught by the /btw early-return and will fall through past the isLoading guard, potentially triggering a normal message send.
5. Transcript truncation cuts mid-message (useChatComposerState.ts:197-200)
out.slice(out.length - BTW_TRANSCRIPT_MAX_CHARS) can cut a message in half. Should find the next \n\n boundary after the cut point to preserve message integrity.
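A sketch of the boundary-snapping truncation (the constant's value is assumed from the 120K figure mentioned in this review; the real helper in useChatComposerState.ts may differ):

```typescript
const BTW_TRANSCRIPT_MAX_CHARS = 120_000; // assumed limit

// Truncate the transcript from the front, then snap forward to the next
// "\n\n" message boundary so no message is cut in half.
function truncateTranscript(out: string, maxChars = BTW_TRANSCRIPT_MAX_CHARS): string {
  if (out.length <= maxChars) return out;
  const cut = out.length - maxChars;
  const boundary = out.indexOf('\n\n', cut);
  // If no boundary exists after the cut point, fall back to the raw slice.
  return boundary === -1 ? out.slice(cut) : out.slice(boundary + 2);
}
```

Snapping forward (rather than backward) keeps the result within the limit, at the cost of dropping the partially cut message.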
6. No request abort on overlay close
If the user closes the overlay while the API call is in flight, the fetch continues and the response is silently discarded. Should use AbortController to cancel the request on close.
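A client-side sketch, assuming the endpoint and payload shape from this PR (POST /api/claude/btw); the wrapper name is hypothetical:

```typescript
// Hypothetical wrapper: start the /btw request and hand back an abort handle
// for the overlay's close handler to call.
function startBtwRequest(question: string, transcript: string, url = '/api/claude/btw') {
  const controller = new AbortController();
  const promise = fetch(url, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ question, transcript }),
    signal: controller.signal,
  }).catch((err: unknown) => {
    if (isAbortError(err)) return null; // overlay closed: discard silently
    throw err; // real errors still surface
  });
  return { promise, abort: () => controller.abort() };
}

// Abort rejections surface as a DOMException named 'AbortError'.
function isAbortError(err: unknown): boolean {
  return typeof err === 'object' && err !== null && (err as { name?: unknown }).name === 'AbortError';
}
```

The catch filter matters: without it, closing the overlay would log a spurious error for an abort the user intentionally triggered.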
💡 Minor
- No server-side size limit on the question field — a very long question + 120K transcript could create an oversized prompt.
- BtwOverlay is rendered inside ChatComposer but is semantically unrelated — consider rendering at ChatInterface level.
- CLOSED_BTW_OVERLAY default shape is duplicated in the closeBtwOverlay callback and the useState initializer — could reuse the constant.
✅ What looks good
- Clean overlay UI with Escape + backdrop click-to-close
- Proper aria-modal and role="dialog" accessibility
- Non-Claude providers get a clear fallback message
- Transcript builder correctly filters to user/assistant messages
- Server-side input validation on the route
Critical fixes:
- Move process.env restore into finally block so stream timeout env
var persists through full stream consumption, eliminating the race
condition under concurrent requests.
- Fix textarea disabled catch-22: remove disabled attr (which blocked
typing /btw entirely), use conditional opacity-50 class instead.
- Fix premature server-side abort: switch from req.on('close') (fires
when json middleware drains body) to res.on('close') + writableFinished
guard so abort only fires on real client disconnect.
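The disconnect detection can be sketched like this (plain Node http types stand in for the PR's Express handler; the wiring is assumed):

```typescript
import type { ServerResponse } from 'node:http';

// Abort only on a real client disconnect. req.on('close') fires as soon as
// the json middleware drains the request body, which is too early; the
// response 'close' event plus a writableFinished check only trips when the
// socket dies before the response completed.
function wireAbort(res: ServerResponse): AbortSignal {
  const controller = new AbortController();
  res.on('close', () => {
    if (!res.writableFinished) controller.abort(); // client went away mid-stream
  });
  return controller.signal;
}
```

The returned signal is what a function like runClaudeBtw would check inside its stream loop and after it.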
Guard logic:
- Only allow /btw during streaming in submit button, handleSubmit guard,
and textarea; all other slash commands blocked when isLoading.
- Add isLoading && commandName !== '/btw' check before executeCommand.
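The combined guard might reduce to something like this (a sketch only; the real check spans the submit button, the textarea, and handleSubmit):

```typescript
// While a response is streaming, only the /btw side-question may be
// submitted; other slash commands and plain messages stay blocked.
function canSubmit(input: string, isLoading: boolean): boolean {
  const text = input.trim();
  if (!text) return false;     // nothing to send
  if (!isLoading) return true; // idle: anything goes
  return text === '/btw' || text.startsWith('/btw '); // mid-stream: /btw only
}
```

Matching the exact command name (rather than a bare startsWith('/btw')) avoids letting a hypothetical /btwfoo command slip through.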
Abort & cleanup:
- Add AbortController to btw fetch; abort on overlay close; catch block
silently returns when aborted.
- Server-side: pass signal to runClaudeBtw, check signal.aborted in
stream loop, catch, and post-loop; res.headersSent guards on response.
Minor:
- Server-side size limits: question 2K chars, transcript 150K chars.
- Move BtwOverlay from ChatComposer to ChatInterface level.
- Extract shared CLOSED_BTW_OVERLAY constant.
- Stabilize getChatMessagesForBtw with useRef + useCallback to avoid
unnecessary executeCommand re-creation. Consolidate duplicate ref.
- Transcript truncation snaps to next \n\n boundary.
- Remove stray blank lines in ChatComposer.
Made-with: Cursor
Force-pushed from 3922a67 to ce85e72
All review items addressed in ce85e72. Here's the point-by-point resolution:

🔴 Critical
1. Race condition with process.env mutation — restore moved into a finally block so the env var persists through full stream consumption.
2. Textarea disabled — removed the disabled attr (it blocked typing /btw entirely); a conditional opacity-50 class dims it instead.

🟡 Should fix
3. Submit button allows any /-prefixed text — only /btw is allowed during streaming; all other slash commands are blocked when isLoading.
4. handleSubmit guard reordering — added an isLoading && commandName !== '/btw' check before executeCommand.
5. Transcript truncation cuts mid-message — Now finds the next \n\n boundary after the cut point.
6. No request abort on overlay close — Added AbortController; the request is aborted on overlay close and the catch block silently returns when aborted.

💡 Minor — all addressed (server-side size limits, BtwOverlay moved to ChatInterface, shared CLOSED_BTW_OVERLAY constant extracted).
Summary
Adds a Claude Code–style /btw side question: a one-shot, tool-free answer in an overlay from the current chat context, without appending to the main transcript. Only when the chat provider is Claude.

Changes
- runClaudeBtw + POST /api/claude/btw
- /btw in the commands API
- BtwOverlay UI; slash commands can run while the model is streaming (textarea not disabled for /…)

Notes