feat: improve function return value tracking#6065

Open
cyyynthia wants to merge 8 commits into rollup:master from cyyynthia:function-return-values

Conversation

Contributor

@cyyynthia cyyynthia commented Aug 9, 2025

This PR contains:

  • bugfix
  • feature
  • refactor
  • documentation
  • other

Are tests included?

  • yes (bugfixes and features will not be merged without tests)
  • no

Breaking Changes?

  • yes (breaking changes will not be merged unless absolutely necessary)
  • no

List any relevant issue numbers:

Description

This PR improves the way return values are resolved inside function calls. It does so via 3 changes:

  • The logic for adding the implicit return in blocks has been tweaked to better detect cases where we're guaranteed to return a value. The implicit return is now an undefined expression rather than an unknown expression, enabling more optimizations to be performed.
  • When a function has multiple return values, they are all combined using the existing MultiExpression node. It has been expanded to resolve a literal value when all paths agree on a specific value, and to attempt resolving truthiness when different values are returned.
  • When evaluating the possible return values via MultiExpression, the reachability of the return statement is verified so that it can be ignored if we are certain the code path is unreachable. As treeshaking information is not fully available at the time literals are being resolved, a lightweight reachability check has been implemented to provide the necessary information.
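As a hedged illustration (hypothetical input, not taken from this PR's test suite), the combined effect of these changes is that a call site can be simplified when every return path agrees on a literal:

```typescript
// Hypothetical input illustrating the optimization: both return paths
// agree on the literal `true`, so a MultiExpression combining them can
// resolve to a known literal and the dead branch below can be dropped.
function alwaysTrue(flag: boolean): boolean {
  if (flag) return true;
  return true;
}

// With return value tracking, the bundler can treat the condition as
// statically truthy and keep only the first branch.
const result = alwaysTrue(Math.random() > 0.5) ? 'kept' : 'dropped';
console.log(result);
```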


vercel bot commented Aug 9, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project Deployment Preview Comments Updated (UTC)
rollup Ready Ready Preview Comment Sep 2, 2025 2:23pm


codecov bot commented Aug 9, 2025

Codecov Report

❌ Patch coverage is 99.10714% with 1 line in your changes missing coverage. Please review.
✅ Project coverage is 98.78%. Comparing base (592e7d7) to head (e0d5ce0).
⚠️ Report is 27 commits behind head on master.

Files with missing lines Patch % Lines
src/ast/nodes/IfStatement.ts 90.00% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #6065      +/-   ##
==========================================
- Coverage   98.79%   98.78%   -0.02%     
==========================================
  Files         271      271              
  Lines       10622    10699      +77     
  Branches     2840     2870      +30     
==========================================
+ Hits        10494    10569      +75     
- Misses         88       90       +2     
  Partials       40       40              



github-actions bot commented Aug 27, 2025

Thank you for your contribution! ❤️

You can try out this pull request locally by installing Rollup via

npm install cyyynthia/rollup#function-return-values

Notice: Ensure you have installed the latest nightly Rust toolchain. If you haven't installed it yet, please see https://www.rust-lang.org/tools/install to learn how to download Rustup and install Rust.

or load it into the REPL:
https://rollup-mktm8ybza-rollup-js.vercel.app/repl/?pr=6065


github-actions bot commented Aug 27, 2025

Performance report

  • BUILD: 6959ms, 837 MB
    • initialize: 0ms, 25.1 MB
    • generate module graph: 2596ms, 632 MB
      • generate ast: 1359ms, 620 MB
    • sort and bind modules: 410ms (+11ms, +2.7%), 692 MB
    • mark included statements: 3953ms, 837 MB
      • treeshaking pass 1: 2330ms, 832 MB
      • treeshaking pass 2: 461ms, 824 MB
      • treeshaking pass 3: 394ms, 838 MB
      • treeshaking pass 4: 383ms, 818 MB
      • treeshaking pass 5: 379ms, 837 MB
  • GENERATE: 706ms (-229ms, -24.5%), 923 MB
    • initialize render: 0ms, 847 MB (-11%)
    • generate chunks: 212ms (+113ms, +113.0%), 838 MB (-10%)
      • optimize chunks: 0ms, 825 MB (-10%)
    • render chunks: 627ms (-219ms, -25.9%), 898 MB
    • transform chunks: 20ms, 923 MB
    • generate bundle: 0ms, 923 MB

Member

@lukastaegert lukastaegert left a comment

This PR does a lot of things, and the concept of locallyReachable makes me a little uneasy. Also, I do not fully understand this concept: is it meant as "there is a chance this node might be included at some point"?
If, for instance, we just took the literal value of all included return statements into account and then deoptimized once another return statement is included, would this deoptimize too often because the value is queried too early, before any return statement is included?

Another point is that so far, I have avoided tracking multiple return expressions out of fear of a performance regression. Below is a (completely artificial) example that demonstrates this. At least for me, it will basically make the browser hang for a long time. Removing the pr=6065& from the URL, i.e. using production Rollup, makes it load quickly:

https://rollup-c7a3llkmm-rollup-js.vercel.app/repl/?pr=6065&shareable=JTdCJTIyZXhhbXBsZSUyMiUzQW51bGwlMkMlMjJtb2R1bGVzJTIyJTNBJTVCJTdCJTIyY29kZSUyMiUzQSUyMmZ1bmN0aW9uJTIwYSgpJTIwJTdCJTVDbiUyMCUyMGNvbnNvbGUubG9nKCdhJyklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBiKDEpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwYigyKSU1Q24lMjAlMjByZXR1cm4lMjBiKDMpJTVDbiU3RCU1Q24lNUNuZnVuY3Rpb24lMjBiKHgpJTIwJTdCJTVDbiUyMCUyMGNvbnNvbGUubG9nKCdiJyUyQyUyMHgpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwYygxKSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMGMoMiklNUNuJTIwJTIwcmV0dXJuJTIwYygzKSU1Q24lN0QlNUNuJTVDbmZ1bmN0aW9uJTIwYyh4KSUyMCU3QiU1Q24lMjAlMjBjb25zb2xlLmxvZygnYyclMkMlMjB4KSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMGQoMSklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBkKDIpJTVDbiUyMCUyMHJldHVybiUyMGQoMyklNUNuJTdEJTVDbiU1Q25mdW5jdGlvbiUyMGQoeCklMjAlN0IlNUNuJTIwJTIwY29uc29sZS5sb2coJ2QnJTJDJTIweCklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBlKDEpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwZSgyKSU1Q24lMjAlMjByZXR1cm4lMjBlKDMpJTVDbiU3RCU1Q24lNUNuZnVuY3Rpb24lMjBlKHgpJTIwJTdCJTVDbiUyMCUyMGNvbnNvbGUubG9nKCdlJyUyQyUyMHgpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwZigxKSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMGYoMiklNUNuJTIwJTIwcmV0dXJuJTIwZigzKSU1Q24lN0QlNUNuJTVDbmZ1bmN0aW9uJTIwZih4KSUyMCU3QiU1Q24lMjAlMjBjb25zb2xlLmxvZygnZiclMkMlMjB4KSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMGcoMSklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBnKDIpJTVDbiUyMCUyMHJldHVybiUyMGcoMyklNUNuJTdEJTVDbiU1Q25mdW5jdGlvbiUyMGcoeCklMjAlN0IlNUNuJTIwJTIwY29uc29sZS5sb2coJ2cnJTJDJTIweCklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBoKDEpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMj
AlM0UlMjAwLjUpJTIwcmV0dXJuJTIwaCgyKSU1Q24lMjAlMjByZXR1cm4lMjBoKDMpJTVDbiU3RCU1Q24lNUNuZnVuY3Rpb24lMjBoKHgpJTIwJTdCJTVDbiUyMCUyMGNvbnNvbGUubG9nKCdoJyUyQyUyMHgpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwaSgxKSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMGkoMiklNUNuJTIwJTIwcmV0dXJuJTIwaSgzKSU1Q24lN0QlNUNuJTVDbmZ1bmN0aW9uJTIwaSh4KSUyMCU3QiU1Q24lMjAlMjBjb25zb2xlLmxvZygnaSclMkMlMjB4KSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMGooMSklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBqKDIpJTVDbiUyMCUyMHJldHVybiUyMGooMyklNUNuJTdEJTVDbiU1Q25mdW5jdGlvbiUyMGooeCklMjAlN0IlNUNuJTIwJTIwY29uc29sZS5sb2coJ2onJTJDJTIweCklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBrKDEpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwaygyKSU1Q24lMjAlMjByZXR1cm4lMjBrKDMpJTVDbiU3RCU1Q24lNUNuZnVuY3Rpb24lMjBrKHgpJTIwJTdCJTVDbiUyMCUyMGNvbnNvbGUubG9nKCdrJyUyQyUyMHgpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwbCgxKSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMGwoMiklNUNuJTIwJTIwcmV0dXJuJTIwbCgzKSU1Q24lN0QlNUNuJTVDbmZ1bmN0aW9uJTIwbCh4KSUyMCU3QiU1Q24lMjAlMjBjb25zb2xlLmxvZygnbCclMkMlMjB4KSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMG0oMSklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBtKDIpJTVDbiUyMCUyMHJldHVybiUyMG0oMyklNUNuJTdEJTVDbiU1Q25mdW5jdGlvbiUyMG0oeCklMjAlN0IlNUNuJTIwJTIwY29uc29sZS5sb2coJ20nJTJDJTIweCklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBuKDEpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwbigyKSU1Q24lMjAlMjByZXR1cm4lMjBuKDMpJTVDbiU3RCU1Q24lNUNuZnVuY3Rpb24lMjBuKHgpJTIwJTdCJTVDbiUyMCUyMGNvbnNvbGUubG9nKCduJyUyQyUyMHgpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwbygxKSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMG8oMiklNUNuJTIwJTIwcmV0dXJuJT
IwbygzKSU1Q24lN0QlNUNuJTVDbmZ1bmN0aW9uJTIwbyh4KSUyMCU3QiU1Q24lMjAlMjBjb25zb2xlLmxvZygnbyclMkMlMjB4KSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMHAoMSklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBwKDIpJTVDbiUyMCUyMHJldHVybiUyMHAoMyklNUNuJTdEJTVDbiU1Q25mdW5jdGlvbiUyMHAoeCklMjAlN0IlNUNuJTIwJTIwY29uc29sZS5sb2coJ3AnJTJDJTIweCklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBxKDEpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwcSgyKSU1Q24lMjAlMjByZXR1cm4lMjBxKDMpJTVDbiU3RCU1Q24lNUNuZnVuY3Rpb24lMjBxKHgpJTIwJTdCJTVDbiUyMCUyMGNvbnNvbGUubG9nKCdxJyUyQyUyMHgpJTVDbiUyMCUyMGlmJTIwKE1hdGgucmFuZG9tKCklMjAlM0UlMjAwLjUpJTIwcmV0dXJuJTIwcigxKSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMHIoMiklNUNuJTIwJTIwcmV0dXJuJTIwcigzKSU1Q24lN0QlNUNuJTVDbmZ1bmN0aW9uJTIwcih4KSUyMCU3QiU1Q24lMjAlMjBjb25zb2xlLmxvZygnciclMkMlMjB4KSU1Q24lMjAlMjBpZiUyMChNYXRoLnJhbmRvbSgpJTIwJTNFJTIwMC41KSUyMHJldHVybiUyMHMoMSklNUNuJTIwJTIwaWYlMjAoTWF0aC5yYW5kb20oKSUyMCUzRSUyMDAuNSklMjByZXR1cm4lMjBzKDIpJTVDbiUyMCUyMHJldHVybiUyMHMoMyklNUNuJTdEJTVDbiU1Q25mdW5jdGlvbiUyMHMoeCklMjAlN0IlNUNuJTIwJTIwY29uc29sZS5sb2coJ3MnJTJDJTIweCklNUNuJTIwJTIwcmV0dXJuJTIwdHJ1ZSUzQiU1Q24lN0QlNUNuJTVDbmlmJTIwKGEoKSklMjBjb25zb2xlLmxvZygnT0snKSUzQiU1Q24lNUNuJTIyJTJDJTIyaXNFbnRyeSUyMiUzQXRydWUlMkMlMjJuYW1lJTIyJTNBJTIybWFpbi5qcyUyMiU3RCU1RCUyQyUyMm9wdGlvbnMlMjIlM0ElN0IlN0QlN0Q=

As this is an artificial example, many code-bases will not have this issue. But some will, or will at least be noticeably slowed down, especially if they have functions with a lot of return statements. To that end, I would make this feature opt-in, e.g. as treeshake.trackAllReturnValues, but add it to the treeshake.preset: "smallest" preset.
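For reference, as far as the encoded REPL payload above can be read, it boils down to a chain of functions a through s where each one forwards to the next from three separate return statements, the last one returning true. Abbreviated here to three levels (a reconstructed sketch, not the verbatim repro), the shape makes the number of distinct return paths grow as 3^depth, which is what blows up exhaustive tracking:

```typescript
// Abbreviated form of the repro: each function has three return
// statements that all forward to the next function in the chain,
// so the set of possible return values multiplies at every level.
function c(x: number): boolean {
  console.log('c', x);
  return true;
}

function b(x: number): boolean {
  console.log('b', x);
  if (Math.random() > 0.5) return c(1);
  if (Math.random() > 0.5) return c(2);
  return c(3);
}

function a(): boolean {
  console.log('a');
  if (Math.random() > 0.5) return b(1);
  if (Math.random() > 0.5) return b(2);
  return b(3);
}

if (a()) console.log('OK');
```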

So in general, this is how I would propose to proceed:

  • Look into aborting binding in a separate PR.
  • Tracking multiple return values should be opt-in via a parameter.
  • Can we implement tracking multiple return statements without the additional "isLocallyReachable" static analysis? While the logic is not too complicated, I am worried that it is another separate layer of executed-code-path analysis that is only used for this one feature, where bugs may be noticed very late, and which will not benefit from improvements in other areas. Another way to do this could be to start tracking return statements the first time either hasEffects, include or includePath is called on a return statement, as that is equivalent to "we know this return statement is in an executed code path". This is also the semantic we use for applyDeoptimizations; in fact, this could just become another "deoptimization" to perform on return statements in their applyDeoptimizations method, i.e. add a return value. The important thing would be to make sure that when a return statement is added and a consumer has already queried the literal return value of the function (tracked via the origin parameter), that consumer is deoptimized. However, I hope this is not a wild goose chase.
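A rough sketch of that last idea (all names hypothetical, not Rollup's actual classes): return statements register their value the first time they are known to execute, and consumers that already queried a literal value are deoptimized when a new return value shows up later.

```typescript
// Hypothetical sketch of the proposed mechanism (invented names):
// a return statement reports its value the first time hasEffects/
// include runs on it, and consumers that already read a literal
// value are re-evaluated when a later return statement is added.
type LiteralValue = string | number | boolean | undefined;
const UnknownValue = Symbol('unknown');

interface Consumer {
  deoptimize(): void;
}

class FunctionReturnTracker {
  private knownReturns: LiteralValue[] = [];
  private consumers = new Set<Consumer>();

  // Called from a return statement's applyDeoptimizations equivalent.
  addReturnValue(value: LiteralValue): void {
    this.knownReturns.push(value);
    // Anyone who already queried a literal value must be deoptimized.
    for (const consumer of this.consumers) consumer.deoptimize();
  }

  getLiteralReturnValue(consumer: Consumer): LiteralValue | typeof UnknownValue {
    this.consumers.add(consumer); // remember the querying origin
    if (this.knownReturns.length === 0) return UnknownValue;
    const [first] = this.knownReturns;
    return this.knownReturns.every(v => v === first) ? first : UnknownValue;
  }
}
```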

What do you think?

Comment on lines +89 to +92
if (!this.blockEndReached && node.haltsCodeFlow()) {
	this.blockEndReached = true;
	this.lastReachableBlock = index;
}
Member

Interesting question if we can just stop binding here. I first thought "no" because there can be hoisted declarations after such a point, but those do not need binding anyway, so this might actually be fine.

Contributor Author

I initially implemented it that way, but it did cause issues with function declarations in effectively unreachable blocks that were no longer properly dealt with iirc.

Comment on lines +104 to +109
haltsCodeFlow(allowOptimizations?: boolean): boolean {
	for (const node of this.body) {
		if (node.haltsCodeFlow(allowOptimizations)) return true;
	}
	return false;
}
Member

For effect checks and inclusion, we already have a different solution that effectively does the same thing, but slightly differently. This is what the execution context and "brokenFlow" are designed for:

brokenFlow: boolean;

This can effectively track how far e.g. a labelled break or a return statement interrupts the code flow. I am not sure we should build a different mechanism in parallel.

Admittedly, this solution is not "simpler" than yours, as you would still need to write some logic for the bind handler on several different nodes; just do a full-text search for brokenFlow to see what would need to be touched. This logic relies on a context that is passed to each call of include or hasEffects, which tracks whether we are in reachable code and which labels we have, basically what is in the ControlFlowContext interface. This can then abort hasEffects checks in BlockStatements and SwitchCases.
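As a simplified sketch of that mechanism (assumed shape; the real context carries more state, such as label tracking):

```typescript
// Simplified sketch of a brokenFlow-style context: a return statement
// marks the flow as broken, and block traversal stops visiting the
// remaining (unreachable) statements.
interface ControlFlowContext {
  brokenFlow: boolean;
}

interface AstNode {
  hasEffects(context: ControlFlowContext): boolean;
}

class ReturnNode implements AstNode {
  hasEffects(context: ControlFlowContext): boolean {
    context.brokenFlow = true; // nothing after this is reachable
    return false;
  }
}

class SideEffectNode implements AstNode {
  hasEffects(_context: ControlFlowContext): boolean {
    return true; // e.g. a console.log call
  }
}

class BlockNode implements AstNode {
  constructor(private body: AstNode[]) {}
  hasEffects(context: ControlFlowContext): boolean {
    for (const node of this.body) {
      if (context.brokenFlow) break; // skip unreachable trailing nodes
      if (node.hasEffects(context)) return true;
    }
    return false;
  }
}
```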

If we go there, I would hope we can move this to a separate PR to make the review easier and allow a separate release.

Contributor Author

I initially used that, but it relies on hasEffects, which is a more expensive function and does not really allow granularity over whether to take optimizations into account. I also feared the more complex rules around break and continue statements would make it quite a lot more difficult to map to the concept of reachability, since breaking out of a loop and returning in a loop have different code-reachability implications that would be difficult to distinguish.

Initially I wanted to use treeshaking information for local reachability, but it's actually not going to work well because then it'd mean getting accurate treeshaking information requires accurate treeshaking information, so we have a chicken and egg problem.

So my idea here was to make a pessimistic, lower-quality and coarse check that would be cheap enough to call without fear of harming performance too much, and that can be used to solve that issue. Given Rollup's design, it's impossible to assume a node is included until proven dead code, since it works the other way around and there is no way to exclude a node once it has been included.

Another concern is that I'm not sure whether the hasEffects check is going to be as exhaustive as I'd like it to be, especially if it aborts early because it knows there are effects quite a bit higher up in the hierarchy and doesn't bother exploring further (which would be a correct behavior given what it tries to establish, but that wouldn't give me the information I'm looking for I think... 🤔)

Contributor Author

Hmm, another issue seems to be that if return statements start to get included, that means we already gave up on the idea of getting rid of the function as a whole, meaning if we end up being able to inline all the places where it's called it'll still be included in the bundle (#5063)

Contributor Author

@cyyynthia cyyynthia left a comment

Hm, you're definitely right that this kind of performance regression is unacceptable (regardless of the exotic/synthetic nature of the scenario). I convinced myself along the way that it shouldn't be too much of a performance issue but I overlooked the exponential growth of return values in deep recursive function call chains that can definitely show up in the real world.

Having the full-depth search opt-in seems a good idea. I'm wondering if depth control would allow for a good balance between performance and output quality (by default have a depth of only 1 or 2, and in the smallest preset have a depth of Infinity, i.e. an exhaustive return value search).

It's very likely that performance can be improved for some scenarios through caching and smarter exploration, perhaps beyond just this use-case if getLiteralValue* gets smarter. I'll explore this further :)


@lukastaegert
Member

Hi, I have slightly lost track of this. What is the current state here? Is this currently in a releasable state from your side? If so, I would focus on finishing the review next.

@cyyynthia
Contributor Author

Hi! The state of things hasn't moved much since your last review and the concerns you raised still are valid. I tried a bunch of things locally, with more often than not more issues created than solved... And then I ran out of free time to mess with this PR :(

The main issue remains that the information I need for the reachability check is not known before the inclusion process is in progress, at which point it is too late to react (since Rollup works via an additive-only strategy). This is the key to solving #5063 robustly, and is quite important to make deep function return evaluation worth the effort.

The PR can likely be split into two separate ones: one adding the deep function return value evaluation capabilities, the other making reachability analysis smarter so that treeshaking can have a "recursive" effect without resorting to treeshaking over and over until we have a stable result, which would be prohibitively expensive >:)

There are also some low-hanging fruits that can be cherry-picked from the PR and improve some situations, such as making blocks return undefined (instead of UnknownValue) through their implicit return statement.
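As a hedged illustration of that implicit-return improvement (hypothetical input), modeling a fall-through as the literal undefined rather than an unknown value lets a nullish-coalescing fallback be resolved statically:

```typescript
// A body path with no explicit return implicitly returns `undefined`.
// Modeling that as the literal `undefined` (instead of UnknownValue)
// means the full set of return values is known: {'label', undefined}.
function maybeLabel(flag: boolean): string | undefined {
  if (flag) return 'label';
  // no else branch: implicit `return undefined`
}

// If the flag is statically false, the `??` fallback can be resolved
// at build time; an unknown implicit return would block this.
const value = maybeLabel(false) ?? 'fallback';
console.log(value);
```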

@lukastaegert
Member

Thanks for the quick response! I think I will leave it like this for a little longer from my side and focus on getting Rollup 5 ready; there is still a lot to do. After that, I am happy to invest more time here. But feel free to drive this forward yourself in the meantime.


Development

Successfully merging this pull request may close these issues.

tree-shaking will not execute recursively.