
Conversation


@unnoq (Member) commented Nov 11, 2025

Previously, the query remained in a pending/undefined state until the first chunk was yielded. Now, after successfully resolving the stream promise, the query is immediately set to an empty array (success state) before iterating through chunks.
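
For illustration, here is a minimal sketch of that behavior, assuming a TanStack Query QueryClient and a hypothetical getStream factory; this is not oRPC's actual implementation:

```ts
import { QueryClient } from '@tanstack/query-core'

// Minimal sketch of the fixed behavior (illustrative, not oRPC's code):
// once the stream promise resolves, the cache is set to [] (success state)
// before any chunk is consumed, instead of staying pending/undefined.
function streamedQueryFn<T>(
  queryClient: QueryClient,
  queryKey: readonly unknown[],
  getStream: () => Promise<AsyncIterable<T>>, // hypothetical stream factory
) {
  return async (): Promise<T[]> => {
    const stream = await getStream()

    // New: mark the query as successful with an empty array right away.
    queryClient.setQueryData<T[]>(queryKey, [])

    for await (const chunk of stream) {
      queryClient.setQueryData<T[]>(queryKey, prev => [...(prev ?? []), chunk])
    }

    return queryClient.getQueryData<T[]>(queryKey) ?? []
  }
}
```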

Summary by CodeRabbit

  • New Features

    • Enhanced streaming query refetch modes with intelligent cache handling (reset, append, replace options)
    • Improved chunk limit enforcement during parallel streaming scenarios
  • Bug Fixes

    • Refined abort signal handling to ensure proper cleanup during streaming operations
    • Fixed cache state management during edge cases and error conditions
  • Tests

    • Expanded test coverage for streaming configurations and refetch behaviors
    • Added comprehensive tests for abort handling and stream edge cases


vercel bot commented Nov 11, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

  • orpc: Ready (Preview, Comment), updated Nov 11, 2025 3:34am (UTC)

@dosubot bot added the size:XL label (This PR changes 500-999 lines, ignoring generated files) on Nov 11, 2025

coderabbitai bot commented Nov 11, 2025

Walkthrough

The stream-query implementation is refactored to introduce refetch mode handling with conditional cache management. The test suite expands significantly to validate various streaming scenarios including refetch modes, chunk limiting, abort handling, and edge cases.

Changes

  • Core streaming implementation (packages/tanstack-query/src/stream-query.ts): renames isRefetch to hasPreviousData; introduces shouldUpdateCacheDuringStream to control incremental cache updates during streaming; adds size-limited cache initialization; replaces the addToEnd helper with limitArraySize; refetchMode 'reset' clears and re-fetches when previous data exists, while other modes preserve or append the cache depending on configuration (an illustrative sketch of these helpers follows below).
  • Streaming test suite expansion (packages/tanstack-query/src/stream-query.test.ts): replaces the single basic test with a comprehensive multi-scenario suite; adds tests for refetchMode options (reset, append, replace), maxChunks handling across parallel and pre-populated scenarios, abort signal handling with cleanup verification, and edge cases (mid-stream cache reset, error handling); introduces the sleep utility import from @orpc/shared for deterministic timing.
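
An illustrative sketch of the helpers named in this summary, under the assumption that they behave as described; the real code in stream-query.ts may differ in detail:

```ts
type RefetchMode = 'reset' | 'append' | 'replace'

// Stand-in for the limitArraySize helper mentioned above: keep only the
// newest `maxChunks` items when a limit is configured.
function limitArraySize<T>(array: T[], maxChunks: number | undefined): T[] {
  return maxChunks === undefined ? array : array.slice(-maxChunks)
}

// Decide the cache's starting value for a (re)fetch. `undefined` means
// "clear and refetch from scratch"; an array means an immediate success state.
function initialCache<T>(
  previousData: T[] | undefined,
  refetchMode: RefetchMode,
  maxChunks: number | undefined,
): T[] | undefined {
  const hasPreviousData = previousData !== undefined

  if (hasPreviousData && refetchMode === 'reset') {
    return undefined // drop stale chunks before streaming again
  }

  if (hasPreviousData && refetchMode === 'append') {
    return limitArraySize(previousData, maxChunks) // keep existing chunks
  }

  // First fetch, or 'replace' mode: start from a size-limited empty array.
  return limitArraySize([], maxChunks)
}
```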

Sequence Diagram

sequenceDiagram
    participant Caller
    participant StreamQuery
    participant Cache
    participant DataStream

    Caller->>StreamQuery: streamQuery(queryFn, options)
    
    alt hasPreviousData && refetchMode === 'reset'
        StreamQuery->>Cache: clear cache
    else
        StreamQuery->>Cache: initialize with limited empty array
    end

    StreamQuery->>DataStream: subscribe to stream
    
    loop for each chunk
        DataStream->>StreamQuery: chunk arrives
        StreamQuery->>StreamQuery: apply limitArraySize
        
        alt shouldUpdateCacheDuringStream
            StreamQuery->>Cache: update cache incrementally
        else
            StreamQuery->>StreamQuery: accumulate in local result
        end
    end

    DataStream->>StreamQuery: stream resolves
    
    alt !shouldUpdateCacheDuringStream
        StreamQuery->>Cache: finalize with accumulated result
    end

    alt cached result exists
        StreamQuery->>Caller: return size-limited cached data
    else
        StreamQuery->>Caller: return accumulated result
    end
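
In code terms, the loop in the diagram might look roughly like the following hedged sketch; consumeStream and its parameters are illustrative, with names borrowed from the walkthrough rather than the module's exact API:

```ts
import { QueryClient } from '@tanstack/query-core'

// Hedged sketch of the chunk loop shown in the diagram above. The real
// implementation in stream-query.ts differs in structure.
async function consumeStream<T>(
  queryClient: QueryClient,
  queryKey: readonly unknown[],
  stream: AsyncIterable<T>,
  shouldUpdateCacheDuringStream: boolean,
  maxChunks: number | undefined,
): Promise<T[]> {
  const limit = (arr: T[]) => (maxChunks === undefined ? arr : arr.slice(-maxChunks))
  let result: T[] = []

  for await (const chunk of stream) {
    if (shouldUpdateCacheDuringStream) {
      // Update the cache incrementally so observers see chunks as they arrive.
      queryClient.setQueryData<T[]>(queryKey, prev => limit([...(prev ?? []), chunk]))
    }
    else {
      // Otherwise accumulate locally and finalize the cache once at the end.
      result = limit([...result, chunk])
    }
  }

  if (!shouldUpdateCacheDuringStream) {
    queryClient.setQueryData<T[]>(queryKey, result)
  }

  // Prefer the (size-limited) cached value when present, else the local result.
  return limit(queryClient.getQueryData<T[]>(queryKey) ?? result)
}
```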

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

  • Test suite expansion: Review logic for each refetchMode scenario, maxChunks enforcement across parallel/sequential cases, and abort cleanup verification (a sketch of the abort cleanup appears after this list)
  • Cache update logic: Verify conditional branching for shouldUpdateCacheDuringStream, size limiting with limitArraySize, and final cache state correctness
  • Refetch mode semantics: Confirm 'reset' behavior clears cache while append/replace modes preserve existing data appropriately
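
For the abort-cleanup point above, here is a self-contained sketch (illustrative, not the test suite's code) of how chunk consumption can honor the queryFn's AbortSignal and still release the iterator:

```ts
// Illustrative only: stop consuming when the AbortSignal fires and make sure
// the underlying async iterator is released via return().
async function consumeUntilAborted<T>(
  iterable: AsyncIterable<T>,
  signal: AbortSignal,
  onChunk: (chunk: T) => void,
): Promise<void> {
  const iterator = iterable[Symbol.asyncIterator]()
  try {
    while (!signal.aborted) {
      const { done, value } = await iterator.next()
      if (done) {
        return
      }
      onChunk(value)
    }
  }
  finally {
    // Runs on abort, completion, or error, so cleanup is always attempted.
    await iterator.return?.()
  }
}
```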

Poem

🐰 A streaming query blooms anew,
With modes and chunks in every hue,
Cache resets, appends, limits grow,
While tests ensure the data flow. 🌊
Aborts handled, edges mend—
A rabbit's joy knows no end! ✨

Pre-merge checks and finishing touches

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning: docstring coverage is 50.00%, below the required threshold of 80.00%. Run @coderabbitai generate docstrings to improve coverage.
✅ Passed checks (2 passed)
  • Description Check ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title check ✅ Passed: the PR title accurately describes the main change, setting the stream query to success immediately after the stream resolves, which aligns with the core objective and implementation changes.

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@gemini-code-assist

Summary of Changes

Hello @unnoq, I'm Gemini Code Assist[1]! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request addresses a specific behavior in tanstack-query's experimental streamed query functionality. Previously, streamed queries would remain in a pending state until the first data chunk was received. This change modifies the streamedQuery implementation to transition the query to a successful state with an empty array immediately after the stream's initial promise resolves, improving the perceived responsiveness and state management. Concurrently, the associated test files have undergone a major overhaul to provide more comprehensive and granular testing for various refetching modes, chunk limiting, and edge cases, ensuring the new behavior is robust and correctly implemented.

Highlights

  • Immediate Success State for Streamed Queries: The primary change ensures that experimental_serializableStreamedQuery immediately sets the query to a success state with an empty array after the stream promise resolves, preventing it from remaining in a pending/undefined state until the first chunk arrives (a consumer-side sketch follows after this list).
  • Refactored and Expanded Test Suite: The test suite for streamedQuery has been significantly refactored and expanded, introducing new describe blocks for refetchMode option, maxChunks option, abort signal handling, and edge cases. This provides more robust and detailed coverage for various streaming and caching scenarios.
  • Improved refetchMode and maxChunks Logic: The internal logic for handling refetchMode (reset, append, replace) and maxChunks has been refined to ensure consistent and correct cache updates, especially during parallel operations and when previous data exists.
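
From the consumer side, the immediate success state means data is already an empty array once the stream opens. A hedged sketch of what UI code can rely on; describeStreamedQuery is hypothetical, and only the UseQueryResult type comes from @tanstack/react-query:

```ts
import type { UseQueryResult } from '@tanstack/react-query'

// Consumer-side view of the highlighted change (illustrative): after the
// stream promise resolves, `data` is already [] with status 'success', so UI
// code can tell "stream open, no chunks yet" apart from "still pending".
function describeStreamedQuery(result: UseQueryResult<string[]>): string {
  if (result.status === 'pending') {
    return 'connecting to stream…'
  }
  if (result.status === 'error') {
    return `stream failed: ${String(result.error)}`
  }
  if (result.data.length === 0) {
    return 'stream open, waiting for the first chunk'
  }
  return result.data.join('\n')
}
```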
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

  • Code Review (/gemini review): Performs a code review for the current pull request in its current state.
  • Pull Request Summary (/gemini summary): Provides a summary of the current pull request in its current state.
  • Comment (@gemini-code-assist): Responds in comments when explicitly tagged, both in pull request comments and review comments.
  • Help (/gemini help): Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check its output and use code with caution.


@gemini-code-assist bot left a comment


Code Review

This pull request fixes an issue where a streamed query would remain in a pending state until the first chunk arrived. The change correctly sets the query to a success state with an empty array immediately after the stream promise resolves. The accompanying tests have been significantly improved, with more granular checks and better organization, providing much stronger validation for the streaming logic. I've found one area for a minor refactoring to remove a redundant operation, but overall the changes are excellent and improve both correctness and test quality.

@unnoq changed the title from "fix(tanstack-query): set query to success immediately after stream resolves" to "fix(tanstack-query): set stream query to success immediately after stream resolves" on Nov 11, 2025

codecov bot commented Nov 11, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.

📢 Thoughts on this report? Let us know!


pkg-pr-new bot commented Nov 11, 2025

More templates

  • @orpc/ai-sdk: npm i https://pkg.pr.new/@orpc/ai-sdk@1202
  • @orpc/arktype: npm i https://pkg.pr.new/@orpc/arktype@1202
  • @orpc/client: npm i https://pkg.pr.new/@orpc/client@1202
  • @orpc/contract: npm i https://pkg.pr.new/@orpc/contract@1202
  • @orpc/experimental-durable-iterator: npm i https://pkg.pr.new/@orpc/experimental-durable-iterator@1202
  • @orpc/hey-api: npm i https://pkg.pr.new/@orpc/hey-api@1202
  • @orpc/interop: npm i https://pkg.pr.new/@orpc/interop@1202
  • @orpc/json-schema: npm i https://pkg.pr.new/@orpc/json-schema@1202
  • @orpc/nest: npm i https://pkg.pr.new/@orpc/nest@1202
  • @orpc/openapi: npm i https://pkg.pr.new/@orpc/openapi@1202
  • @orpc/openapi-client: npm i https://pkg.pr.new/@orpc/openapi-client@1202
  • @orpc/otel: npm i https://pkg.pr.new/@orpc/otel@1202
  • @orpc/experimental-pino: npm i https://pkg.pr.new/@orpc/experimental-pino@1202
  • @orpc/experimental-publisher: npm i https://pkg.pr.new/@orpc/experimental-publisher@1202
  • @orpc/experimental-ratelimit: npm i https://pkg.pr.new/@orpc/experimental-ratelimit@1202
  • @orpc/react: npm i https://pkg.pr.new/@orpc/react@1202
  • @orpc/react-query: npm i https://pkg.pr.new/@orpc/react-query@1202
  • @orpc/experimental-react-swr: npm i https://pkg.pr.new/@orpc/experimental-react-swr@1202
  • @orpc/server: npm i https://pkg.pr.new/@orpc/server@1202
  • @orpc/shared: npm i https://pkg.pr.new/@orpc/shared@1202
  • @orpc/solid-query: npm i https://pkg.pr.new/@orpc/solid-query@1202
  • @orpc/standard-server: npm i https://pkg.pr.new/@orpc/standard-server@1202
  • @orpc/standard-server-aws-lambda: npm i https://pkg.pr.new/@orpc/standard-server-aws-lambda@1202
  • @orpc/standard-server-fastify: npm i https://pkg.pr.new/@orpc/standard-server-fastify@1202
  • @orpc/standard-server-fetch: npm i https://pkg.pr.new/@orpc/standard-server-fetch@1202
  • @orpc/standard-server-node: npm i https://pkg.pr.new/@orpc/standard-server-node@1202
  • @orpc/standard-server-peer: npm i https://pkg.pr.new/@orpc/standard-server-peer@1202
  • @orpc/svelte-query: npm i https://pkg.pr.new/@orpc/svelte-query@1202
  • @orpc/tanstack-query: npm i https://pkg.pr.new/@orpc/tanstack-query@1202
  • @orpc/trpc: npm i https://pkg.pr.new/@orpc/trpc@1202
  • @orpc/valibot: npm i https://pkg.pr.new/@orpc/valibot@1202
  • @orpc/vue-colada: npm i https://pkg.pr.new/@orpc/vue-colada@1202
  • @orpc/vue-query: npm i https://pkg.pr.new/@orpc/vue-query@1202
  • @orpc/zod: npm i https://pkg.pr.new/@orpc/zod@1202

commit: 8749cd5

@unnoq added the lgtm label (This PR has been approved by a maintainer) on Nov 11, 2025
@unnoq merged commit bbe55b7 into main on Nov 12, 2025
11 checks passed