
feat: add Node.js native stream helpers for render pipeline#91580

Open
benfavre wants to merge 4 commits into vercel:canary from benfavre:perf/node-stream-helpers

Conversation

Contributor

@benfavre benfavre commented Mar 18, 2026

Summary

Adds Node.js native stream utilities (node-stream-helpers.ts) as the foundation for replacing WhatWG stream polyfills in the SSR render pipeline. Profiling shows the web-to-node conversion layer accounts for 35%+ of CPU time in SSR workloads.

Helpers provided

  • chainNodeStreams (web equivalent: chainStreams): Sequential stream concatenation
  • createBufferedTransformNode (web equivalent: createBufferedTransformStream): Batches small chunks before flushing
  • createInlinedDataNodeStream (web equivalent: createFlightDataInjectionTransformStream): Inlines RSC flight data into HTML
  • pipeNodeReadableToResponse (web equivalent: pipeReadable, via WritableStream): Direct pipe to ServerResponse
  • nodeStreamToBuffer / nodeStreamToString (web equivalent: streamToString): Collect stream into Buffer/string
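The buffering helper's core behavior can be sketched as a Node `Transform` (a minimal sketch with an illustrative default threshold; the real helper also flushes on a scheduled tick via scheduleImmediate, which this omits):

```typescript
import { Transform, type TransformCallback } from 'node:stream'

// Sketch of createBufferedTransformNode: accumulate small chunks and emit
// them as a single write once the buffer crosses a size threshold, flushing
// whatever remains when the stream ends. Threshold value is illustrative.
function createBufferedTransformNode(maxBufferBytes = 16 * 1024): Transform {
  let buffered: Buffer[] = []
  let size = 0
  const flushBuffered = (stream: Transform): void => {
    if (size > 0) {
      stream.push(Buffer.concat(buffered))
      buffered = []
      size = 0
    }
  }
  return new Transform({
    transform(chunk: Buffer, _enc, callback: TransformCallback) {
      buffered.push(chunk)
      size += chunk.length
      if (size >= maxBufferBytes) flushBuffered(this)
      callback()
    },
    flush(callback: TransformCallback) {
      flushBuffered(this) // emit any remainder before ending
      callback()
    },
  })
}
```

Batching like this matters for SSR because React tends to emit many tiny chunks, and each write to the response has per-write overhead.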

Key design decisions

  • Lazy require node:stream: Avoids top-level import that would break Edge runtime and prevents Webpack/Turbopack from pulling node:stream into client bundles (DCE-safe)
  • scheduleImmediate from lib/scheduler: Falls back to setTimeout(cb, 0) on Edge instead of raw setImmediate
  • safePipe helper: Ensures errors from source streams propagate to destination transforms (Node.js .pipe() does NOT forward errors by default)
  • Streaming TextDecoder in nodeStreamToString: Correctly handles multi-byte UTF-8 characters that span chunk boundaries
  • bindSnapshot() on all stream callbacks: Preserves AsyncLocalStorage context across async boundaries (addresses review feedback from @lubieowoce on PRs #89859 and #90500)
  • pipeNodeReadableToResponse onEnd semantics: Calls onEnd on success and client disconnect (for cleanup), but NOT on stream error (response is already destroyed)

What this PR does NOT do

This PR only adds the helpers + tests. It does not wire them into the render pipeline yet -- that will be a follow-up PR that replaces the WhatWG stream path with these native equivalents behind a flag.

Test plan

  • Unit tests for all helpers (node-stream-helpers.test.ts)
  • Tests cover: empty streams, single stream passthrough, multi-stream chaining, error propagation, buffered flushing, data inlining with/without delay, client disconnect handling, multi-byte UTF-8 streaming
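The multi-byte UTF-8 case depends on keeping a single TextDecoder in streaming mode. A minimal sketch of nodeStreamToString (the real implementation may differ):

```typescript
import { Readable } from 'node:stream'

// Sketch of nodeStreamToString: one TextDecoder in streaming mode carries
// partial multi-byte sequences across chunk boundaries. Decoding each chunk
// independently (e.g. chunk.toString('utf-8')) would corrupt a character
// split between two chunks.
async function nodeStreamToString(stream: Readable): Promise<string> {
  const decoder = new TextDecoder('utf-8')
  let result = ''
  for await (const chunk of stream) {
    result += decoder.decode(chunk as Uint8Array, { stream: true })
  }
  return result + decoder.decode() // flush any buffered trailing bytes
}
```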

Add `node-stream-helpers.ts` with Node.js native stream utilities that
parallel the WhatWG stream helpers in `node-web-streams-helper.ts`.
These are the foundational building blocks needed for the node-streams
rendering effort (PRs vercel#89566, vercel#89859, vercel#89860, vercel#90500).

Key functions:
- `chainNodeStreams()` - chains multiple Readable streams sequentially
- `createBufferedTransformNode()` - batches small chunks before flushing
- `createInlinedDataNodeStream()` - inlines flight data into HTML stream
- `pipeNodeReadableToResponse()` - pipes Readable directly to ServerResponse
- `nodeStreamToBuffer()` / `nodeStreamToString()` - collection utilities

ALS context propagation uses `bindSnapshot()` from the existing
`async-local-storage.ts` module, which wraps `AsyncLocalStorage.bind()`.
This addresses the review feedback from @lubieowoce on PR vercel#89859 where
ALS context was incorrectly propagated by wrapping callback return values
instead of binding the callbacks themselves.

This PR adds only the helper utilities as new files. No existing files
are modified. Wiring into the render pipeline is a separate step.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Collaborator

nextjs-bot commented Mar 18, 2026

Allow CI Workflow Run

  • approve CI run for commit: e334c4b

Note: this should only be enabled once the PR is ready to go and can only be enabled by a maintainer

benfavre and others added 2 commits March 18, 2026 13:31
- Use lazy require('node:stream') for Edge/DCE compatibility
- Use scheduleImmediate instead of raw setImmediate
- Use streaming TextDecoder in nodeStreamToString
- Add safePipe helper for proper error propagation
- Fix pipeNodeReadableToResponse onClose/onEnd semantics
- Add TODO for Buffer-based tag indexOf optimization

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- safePipe: use .once('error') instead of .on('error') — streams emit
  at most one error, and once auto-removes the listener
- pipeNodeReadableToResponse: remove unused DetachedPromise (was
  allocated per request but never awaited or returned)
- createInlinedDataNodeStream: skip startPulling call when already
  started (avoids redundant function call per chunk)
benfavre added a commit to benfavre/next.js that referenced this pull request Mar 18, 2026
…eams flag

Add `experimental.useNodeStreams` config flag that switches the stream
operations module (stream-ops.ts) to load native Node.js implementations
for hot-path functions:

- chainStreams: uses chainNodeStreams (PassThrough-based sequential piping)
- streamToBuffer: uses nodeStreamToBuffer (for-await on Node Readable)
- streamToString: uses nodeStreamToString (streaming TextDecoder)
- renderToFizzStream: uses renderToPipeableStream instead of
  renderToReadableStream, avoiding web→node conversion overhead in React

Complex transform chains (continueFizzStream, prerender continuations)
still delegate to the web implementation as a stopgap — the native
buffering and data inlining transforms from node-stream-helpers will be
wired in a follow-up once the web transform chain is decomposed.

Includes the node-stream-helpers module from PR vercel#91580 which provides
the underlying native stream utilities.

References: vercel#91580, vercel#89566, vercel#90500

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
When delayDataUntilFirstHtmlChunk=true and the HTML stream emits zero
chunks, the transform callback is never called so startPulling() is
never invoked. In flush(), dataExhausted is false, causing it to wait
on dataComplete.promise which never resolves — a hang.

Call startPulling() in flush() if it hasn't been called yet, so the
data stream listeners are attached and can resolve the promise.

Adds a regression test for the empty-HTML-stream edge case.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
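The hang described above can be modeled in a few lines (illustrative names; dataComplete stands in for the helper's promise, and resolving it stands in for attaching the data-stream listeners):

```typescript
// Minimal model of the empty-HTML-stream hang. If startPulling() is only
// ever called from the per-chunk transform callback, a zero-chunk HTML
// stream means the data-stream listeners are never attached, and flush()
// awaits a promise that nothing will ever resolve.
let pulling = false
let resolveDataComplete: () => void = () => {}
const dataComplete = new Promise<void>((resolve) => {
  resolveDataComplete = resolve
})

function startPulling(): void {
  if (pulling) return
  pulling = true
  // stands in for attaching 'data'/'end' listeners on the data stream;
  // here the data stream is already exhausted, so resolve immediately
  resolveDataComplete()
}

async function flush(): Promise<void> {
  startPulling() // the fix: guarantee listeners exist even with zero chunks
  await dataComplete
}
```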
Contributor Author

Test Verification

  • node-stream-helpers.test.ts: 20/20 passed (new)
  • chainNodeStreams: empty, single, multi, error propagation
  • createBufferedTransformNode: batching, maxBufferBytes, flush
  • createInlinedDataNodeStream: no-delay, delay, empty data, late data, empty HTML with delay
  • pipeNodeReadableToResponse: HTTP server e2e, destroyed response
  • nodeStreamToString: multi-byte UTF-8 split across chunks
  • ALS context via bindSnapshot verified

All tests run on the perf/combined-all branch against canary. Total: 203 tests across 13 suites, all passing.
