perf: partially replace .concat with .push (#13609)
Build successful! You can test your changes in the REPL here: https://babeljs.io/repl/build/48166/
JLHwung left a comment:
For more information, Shi Ling has posted an article on why `.push` is faster than `.concat`: https://dev.to/uilicious/javascript-array-push-is-945x-faster-than-array-concat-1oki
From the article linked by @JLHwung:

Unless we have data showing that the first one is actually better, we can probably use the second one (except for in the helper). I'd expect all engines to optimize the spread operator when …
These are the results of a bench with:

```js
// Setup
let arr0 = []
let arr1 = [0, 1]

// .push(...arr)
for (let i = 0; i < 2; i++) {
  arr0.push(...arr1)
}

// .concat(arr)
for (let i = 0; i < 2; i++) {
  arr0 = arr0.concat(arr1)
}

// .push.apply(this, arr)
for (let i = 0; i < 2; i++) {
  Array.prototype.push.apply(arr0, arr1)
}
```

Results: …
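As a side note on what the three variants above actually do, here is a small sketch (not from the PR) of the semantic difference behind the numbers: `.concat` returns a brand-new array on every call, so each iteration of the concat loop allocates garbage, while `.push` and `push.apply` append to the existing array in place.

```javascript
// Sketch (illustrative, not from the PR): concat re-allocates, push mutates.
let arr0 = [];
const arr1 = [0, 1];

const before = arr0;
arr0 = arr0.concat(arr1); // allocates and returns a new array
console.log(before === arr0); // false: a fresh array was created

const same = arr0;
arr0.push(...arr1); // appends in place
console.log(same === arr0); // true: still the same array object
console.log(arr0); // [ 0, 1, 0, 1 ]
```

The extra allocation per iteration is why the `.concat` variant tends to fall behind the mutating variants in these micro-benchmarks.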
I have set up a benchmark case here: … Anyway, …
jsbench.me probably runs the tests in the browser's own engine, so for me it used WebKit.
That really depends on the engine. For me on FF90, the difference is only about 1%. And for the tiny arrays you used, a custom function is much faster:

```js
// Setup
let arr0 = []
let arr1 = Array.from({ length: 10 }, (_, i) => i);
let N = 10;

function extend(dest, src) {
  const end = src.length;
  for (let di = dest.length, si = 0; si < end; ) {
    dest[di++] = src[si++];
  }
}

// .push(...arr)
for (let i = 0; i < N; i++) {
  arr0.push(...arr1)
}

// .push.apply(this, arr)
for (let i = 0; i < N; i++) {
  Array.prototype.push.apply(arr0, arr1)
}

// extend(dest, src)
for (let i = 0; i < N; i++) {
  extend(arr0, arr1);
}
```

Results: `.push(...arr)` — 215666.98 ops/s ± 1.34%; …

When I increase the source array length to 20, the custom function still wins by ~20%. It starts losing somewhere around length 50. For longer arrays it is slower, but it doesn't have a length limit like `Array.prototype.push.apply` does:

```js
Array.prototype.push.apply([], { length: 125000 });
// Uncaught RangeError: Maximum call stack size exceeded
```
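The `RangeError` happens because `apply` spreads the whole source array into call arguments, and engines cap the argument count. A common workaround, sketched here under my own assumptions (the helper name and the chunk size are illustrative, not from the PR), is to push in fixed-size chunks so each `apply` call stays small while `dest` is still mutated in place:

```javascript
// Hypothetical helper (not Babel's code): append src to dest in chunks so
// no single push.apply call exceeds the engine's argument-count limit.
function pushAllChunked(dest, src) {
  const CHUNK = 32768; // arbitrary conservative chunk size
  for (let i = 0; i < src.length; i += CHUNK) {
    Array.prototype.push.apply(dest, src.slice(i, i + CHUNK));
  }
  return dest;
}

const big = new Array(125000).fill(1);
const out = pushAllChunked([], big);
console.log(out.length); // 125000
```

The `slice` per chunk adds some allocation back, so this only makes sense for inputs large enough to risk the argument limit.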
This code is going to run in V8 (Node.js) 99.9% of the time. Running @JLHwung's jsbench I get these results: … The …
```diff
 export default function removeTypeDuplicates(
-  nodes: ReadonlyArray<t.FlowType | false | null | undefined>,
+  nodes: (t.FlowType | false | null | undefined)[],
```
This is exposed as part of the public API, so we cannot change this.
We could clone nodes at the beginning of the function so that we can then .push() into it, but we would need a few benchmarks to check if it's worth it.
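The clone-then-push idea can be sketched like this, with a deliberately simplified hypothetical helper (not Babel's actual `removeTypeDuplicates`): clone the read-only parameter once up front, then mutate the private copy with `.push()` instead of re-allocating with `.concat()` on every iteration.

```javascript
// Illustrative only: one up-front clone makes later in-place pushes safe.
function flattenOnce(nodes) {
  nodes = nodes.slice(); // single clone; the caller's array stays untouched
  for (let i = 0; i < nodes.length; i++) {
    if (Array.isArray(nodes[i])) {
      nodes.push(...nodes[i]); // safe: we only mutate our own copy
      nodes.splice(i--, 1);    // drop the nested array we just expanded
    }
  }
  return nodes;
}

const input = [1, [2, 3], 4];
console.log(flattenOnce(input)); // [ 1, 4, 2, 3 ]
console.log(input); // [ 1, [ 2, 3 ], 4 ] — unchanged
```

Whether the one clone pays for itself depends on how often the function actually appends, which is exactly why the cases differ.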
In this case benchmarks are not that useful because we should evaluate a number of different cases.
```diff
 if (isTSUnionType(node)) {
   if (typeGroups.indexOf(node.types) < 0) {
-    nodes = nodes.concat(node.types);
+    nodes.push(...node.types);
```
In the TS version, `nodes` is not marked as readonly; should we align it with the Flow version, or vice versa?
Yeah, we can mark the TS version as readonly. In general, public APIs should treat their parameters as readonly unless the function name clearly indicates that it's mutating.
We can do it in a separate PR so it gets its own changelog entry, since it affects a different kind of user.
packages/babel-types/src/modifications/flow/removeTypeDuplicates.ts
We replace `this.concat(arr)` calls with `Array.prototype.push.apply(this, arr)`. `Array.prototype.push.apply(this, arr)` is quicker than `.push(...arr)` because it does not need to spread the array.