The first time I shipped a tiny “refactor” that swapped [] for new Array(n), I learned the hard way that array creation is not a cosmetic choice. A seemingly harmless change flipped a list of prices into a sparse array of empty slots, and a downstream loop quietly skipped them. No crash, no red error, just wrong results. If you’ve ever stared at length and wondered why it doesn’t match what you see, you’ve met this edge. In this post I’ll walk you through the practical differences between Array() and [], why the constructor behaves differently with a single number, and how that can leak into bugs, performance surprises, and even JSON serialization oddities. I’ll show you how I reason about arrays in 2026-era codebases, when I still reach for new Array(), and how you can make your intent obvious to teammates and to tools. Expect real-world examples, minimal fluff, and a clear recommendation you can apply today.
Arrays Are Objects With Two Kinds of “Empty”
JavaScript arrays are objects with numeric keys and a special length property. That sounds simple until you bump into the difference between “missing elements” and “elements that exist but are undefined.” The array constructor creates empty slots, while array literals usually create actual elements.
Empty slots are not the same as undefined. They behave differently in iteration methods, in serialization, and in how engines optimize memory. That distinction is the core of the Array() vs [] difference. When I debug array-related issues in production, I almost always check for sparse arrays (arrays with empty slots) because they can silently change behavior.
Here’s a quick visual:
const literal = [undefined, undefined];
const constructor = new Array(2);
console.log(literal.length); // 2
console.log(constructor.length); // 2
console.log(literal); // [ undefined, undefined ]
console.log(constructor); // [ <2 empty items> ] in Node — two holes
console.log(0 in literal); // true
console.log(0 in constructor); // false
Both arrays report length 2, but the first one actually has elements at index 0 and 1. The second one has holes. This is why a single numeric argument to Array() is not just a convenience; it’s a different data shape.
The Constructor Rule That Trips Everyone
Array() (or new Array()) is overloaded. Its behavior depends on the number and type of arguments:
- No arguments → empty array
- One non-number argument → array with that single element
- One number argument → array with length set to that number (holes)
- Two or more arguments → array containing those arguments
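A minimal sketch of all four cases side by side (the variable names are mine):

```javascript
// Same constructor, four different results depending on the arguments.
const none = new Array();       // no arguments → empty array
const single = new Array("x");  // one non-number → ["x"]
const sized = new Array(3);     // one number → length 3, every slot a hole
const many = new Array(1, 2);   // two or more arguments → [1, 2]

console.log(none.length);              // 0
console.log(single);                   // [ 'x' ]
console.log(sized.length, 0 in sized); // 3 false
console.log(many);                     // [ 1, 2 ]
```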
That single-number case is the trap. Compare these side by side:
const a = new Array(5);
const b = [5];
console.log(a); // [ <5 empty items> ] in Node — five holes
console.log(b); // [ 5 ]
console.log(a.length); // 5
console.log(b.length); // 1
This isn’t a small difference. It changes what your loops see, how methods behave, and whether your array is dense or sparse. In my experience, most array bugs caused by Array() are due to this single numeric argument rule.
Literal Syntax: Predictable and Explicit
Array literals ([]) are the most predictable way to create arrays. They never interpret numbers as sizes; they always treat what you write as elements. That predictability is why I recommend literals for 95% of array creation.
const prices = [19.99, 29.99, 9.99];
const empty = [];
const one = [5];
No surprises. You get the elements you list, in the order you list them, and length is equal to the number of elements. If you want to emphasize “this is a list of values,” the literal is the clearest possible signal to both humans and linters.
I also like that array literals are shorter and more readable, especially in code reviews. When I scan a diff, [5] tells me “this array has one element.” new Array(5) tells me “maybe a preallocated array? maybe a mistake?”
Constructor Syntax: When It’s Actually Useful
There are legitimate uses for new Array(n). I still use it when I explicitly want empty slots or I’m planning to fill the array by index and want the length pre-set. But that’s rare, and I document it when I do.
Example: You want a fixed-length array that you fill with for loops and you don’t want to initialize values yet.
const slots = new Array(3);
for (let i = 0; i < slots.length; i++) {
// I’m reserving space and will fill later
slots[i] = i * 2;
}
console.log(slots); // [ 0, 2, 4 ]
Note what happens: once you assign values, the holes are replaced by actual elements. If you’re going to fill every slot anyway, you can also use Array.from or fill, which I’ll cover later.
In modern code, I prefer Array.from({ length: n }) or Array(n).fill(...) because they’re explicit about the intent. But there are still cases where the constructor is the simplest way to express “sparse array of length n.”
Iteration Differences That Change Your Logic
This is the part that surprises most people. Many array methods skip empty slots. If you create an array with new Array(5), several methods behave as if the elements don’t exist at all.
const sparse = new Array(3);
const dense = [undefined, undefined, undefined];
console.log(sparse.map((_, i) => i)); // [ <3 empty items> ] — the callback never runs
console.log(dense.map((_, i) => i)); // [ 0, 1, 2 ]
sparse.forEach((_, i) => console.log(i)); // logs nothing
dense.forEach((_, i) => console.log(i)); // 0, 1, 2
This is not just a curiosity. Imagine a validation pass that uses forEach to check values. A sparse array won’t execute the callback for holes, so your validation silently skips them. That has real consequences.
I’ve seen production bugs where a sparse array bypassed transformation logic, and the original empty slots survived into JSON output as null or simply vanished. That’s why I avoid creating sparse arrays unless I’m intentionally working with them.
Serialization: JSON and Spread Surprises
When you serialize arrays, holes behave differently than undefined values. For JSON:
const sparse = new Array(2);
const dense = [undefined, undefined];
console.log(JSON.stringify(sparse)); // "[null,null]" — spec-defined, not engine-dependent
console.log(JSON.stringify(dense)); // "[null,null]"
JSON turns both holes and undefined into null, so on the surface they look the same. But in other contexts they diverge. The spread operator and Array.prototype.concat copy holes differently than explicit undefined values.
const sparse = new Array(2);
const dense = [undefined, undefined];
console.log([...sparse]); // [ undefined, undefined ] (holes become undefined)
console.log([...dense]); // [ undefined, undefined ]
Spreading a sparse array materializes holes into undefined values. That can be helpful or confusing, depending on what you expect. If your code depends on the presence or absence of keys, Array() may be the wrong foundation.
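To make the concat difference concrete — concat preserves holes, while spread materializes them — here's a small sketch:

```javascript
const sparse = new Array(2);

const viaConcat = [].concat(sparse); // concat copies holes as holes
const viaSpread = [...sparse];       // spread turns holes into real undefineds

console.log(0 in viaConcat); // false — still a hole
console.log(0 in viaSpread); // true — an actual undefined element
console.log(viaConcat.length, viaSpread.length); // 2 2
```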
Practical Edge Cases I See in Real Projects
Here are a few situations where the Array() vs [] difference matters in practice:
1) Length-Based Loops
If you do a for loop with i < arr.length, holes still count. That means you can iterate by index and fill or check holes, but if you use higher-order methods, those holes disappear.
const report = new Array(3);
for (let i = 0; i < report.length; i++) {
if (!(i in report)) {
report[i] = "pending"; // fill missing entries
}
}
console.log(report); // ["pending", "pending", "pending"]
2) Validation or Sanitization
Using map, filter, or forEach against a sparse array can silently skip slots. If you need to validate that an array has a value for every index, a sparse array is already a red flag.
const entries = new Array(3);
const validated = entries.map(v => v ?? "default");
console.log(validated); // [ <3 empty items> ] — still sparse, the callback never ran
If you intended defaults, you need a dense array or a different creation method.
3) UI Rendering Lists
Frameworks like React or Vue expect arrays of items. A sparse array may render fewer list items than its length suggests. This can show up as “missing rows” with no obvious error.
I’ve watched teams spend hours on “off-by-one” bugs that were actually “holes in a list.” The fix was simply to replace new Array(n) with Array.from({ length: n }) and map over it.
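A framework-free sketch of that bug — counting how many row strings each creation style actually produces (the `<li>` markup is purely illustrative):

```javascript
const rowCount = 4;

// Sparse source: map's callback never runs, so no rows are produced.
const sparseRows = new Array(rowCount).map((_, i) => `<li>Row ${i}</li>`);

// Dense source: every index exists, so every row renders.
const denseRows = Array.from({ length: rowCount }, (_, i) => `<li>Row ${i}</li>`);

console.log(Object.keys(sparseRows).length); // 0 — the "missing rows" bug
console.log(denseRows.length);               // 4
```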
4) Memory Expectations
Developers sometimes use new Array(n) hoping for a performance boost, but modern engines handle dense arrays and typed arrays more predictably. If you need memory-efficient fixed-size numeric data, a typed array like Float32Array or Uint32Array is more appropriate in 2026 than a sparse JS array.
Modern, Explicit Alternatives I Recommend
When I want a list of length n and I want every slot to exist, I rarely use new Array(n) alone. I use one of these patterns instead. They make intent obvious and avoid holes.
Array.from
const count = 5;
const ids = Array.from({ length: count }, (_, i) => `ID-${i + 1}`);
console.log(ids); // ["ID-1", "ID-2", "ID-3", "ID-4", "ID-5"]
This is my default in modern code. It’s readable, supports a mapping function, and avoids sparse arrays entirely.
fill
const seats = new Array(4).fill("available");
console.log(seats); // ["available", "available", "available", "available"]
fill is great for default values. It’s explicit and easy to scan. Just be careful with objects—fill({}) shares the same object reference across all slots.
Array.from with keys
const indices = Array.from({ length: 3 }, (_, i) => i);
console.log(indices); // [0, 1, 2]
This makes index arrays without holes, and it’s more self-explanatory than new Array(3).keys() in my opinion.
Typed Arrays for Numeric Data
When I need numeric arrays for performance or memory predictability, I use typed arrays. They behave very differently from JS arrays but solve a different class of problems.
const weights = new Float64Array(3);
weights[0] = 1.5;
weights[1] = 2.0;
weights[2] = 2.5;
Typed arrays are zero-initialized and always dense. If you’re measuring performance in tight loops, these can be faster and more cache-friendly.
Common Mistakes and How I Avoid Them
I see the same traps over and over. Here’s how I work around them in my own code and when reviewing code from others.
Mistake 1: new Array(1) for a Single Element
This creates a length-1 sparse array, not an array with value 1.
Fix: Use [1] or Array.of(1).
const one = Array.of(1); // [1]
Mistake 2: Using map on a Sparse Array
The callback never runs for holes, so you get holes back.
Fix: If you want to map, create a dense array first.
const data = Array.from({ length: 4 }, () => 0).map(v => v + 1);
console.log(data); // [1, 1, 1, 1]
Mistake 3: Assuming length Means “Elements”
length can be larger than the number of actual elements. A sparse array can have length = 100 and only one element at index 99.
Fix: If you care about actual elements, use Object.keys(arr).length or iterate with for...of and count. Better: avoid sparse arrays unless you truly want them.
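The length-vs-elements gap is easy to demonstrate:

```javascript
const sparse = new Array(100);
sparse[99] = "last";

console.log(sparse.length);              // 100 — one more than the highest index
console.log(Object.keys(sparse).length); // 1 — only one element actually exists
```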
Mistake 4: Shallow Copies of filled Objects
If you fill with an object, all elements reference the same object.
Fix: Use Array.from to create distinct objects.
const rows = Array.from({ length: 3 }, () => ({ status: "new" }));
When I Actually Choose Array()
I rarely use Array() because it invites mistakes in code review. But I do use it in a few specific situations:
1) Creating a sparse array on purpose.
If I want to track “known” vs “unknown” positions and I want holes to represent “missing,” a sparse array makes sense. I document this clearly in code comments.
2) Legacy APIs that expect sparse arrays.
Some older libraries treat holes as “not present” and use that as a signal. In those cases, I follow the API’s contract.
3) Performance experiments in V8 or SpiderMonkey.
Occasionally I benchmark code paths and discover that a sparse array is faster for a very specific workload (typically around creation cost). That’s rare and not the default.
If you use Array(), I recommend you do it with Array.of or with multiple arguments to avoid the single-number ambiguity:
const ok1 = Array.of(5); // [5]
const ok2 = Array.of(1, 2); // [1, 2]
Array.of was added exactly to fix the confusing constructor semantics. It never treats a number as length.
A Clear Recommendation You Can Apply Today
If you write modern JavaScript for teams, I suggest this rule of thumb:
- Use array literals ([]) for values.
- Use Array.from or fill for pre-sized arrays that need values.
- Use Array.of if you want a constructor-like call with single numbers.
- Use new Array(n) only when you intentionally want holes and can explain why in one sentence.
This isn’t about style preferences; it’s about avoiding latent bugs and making your intent obvious. When I read a codebase and see new Array(5), I assume it is either a mistake or a special case. If it’s special, the code should say so.
A Deeper Look at “Empty Slots” in Practice
Let’s explore how empty slots behave across common operations. This is where the constructor’s difference becomes visible in production.
in and hasOwnProperty
const sparse = new Array(2);
const dense = [undefined, undefined];
console.log(0 in sparse); // false
console.log(0 in dense); // true
console.log(sparse.hasOwnProperty(0)); // false
console.log(dense.hasOwnProperty(0)); // true
Empty slots don’t exist as properties. That matters if you’re checking presence, validating indexes, or doing proxy-based reactivity.
for...of vs for...in
const sparse = new Array(3);
for (const v of sparse) {
console.log(v); // undefined, three times — the array iterator does not skip holes
}
for (const i in sparse) {
console.log(i); // nothing — holes are not enumerable keys
}
The two loops disagree. for...of uses the array iterator, which yields undefined for every index up to length, so holes come through as undefined values. for...in enumerates only the keys that actually exist, so it skips holes entirely. If you need to visit every index and still distinguish holes from values, use a classic for loop with an in check.
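Here's what a classic index loop looks like, using an in check to tell holes apart from real values:

```javascript
const sparse = new Array(3);
sparse[1] = "b";

const seen = [];
// An index-based loop visits every position up to length, holes included.
for (let i = 0; i < sparse.length; i++) {
  seen.push(i in sparse ? sparse[i] : "(hole)");
}

console.log(seen); // [ '(hole)', 'b', '(hole)' ]
```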
reduce
reduce skips holes too, and it can throw if there are no existing elements and no initial value.
const sparse = new Array(3);
try {
sparse.reduce((a, b) => a + b);
} catch (e) {
console.log("reduce failed");
}
This behavior can be baffling if you expected reduce to run with undefined values. It won’t. Again: holes are not values.
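Passing an initial value sidesteps the throw, and it also proves that holes are genuinely skipped rather than treated as undefined:

```javascript
const sparse = new Array(2);
const dense = [undefined, undefined];

// Holes are skipped entirely, so the seed comes back untouched.
console.log(sparse.reduce((acc, v) => acc + v, 0)); // 0

// Real undefined values are visited, and 0 + undefined is NaN.
console.log(dense.reduce((acc, v) => acc + v, 0)); // NaN
```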
Real-World Scenario: Building a Calendar Grid
Here’s a common UI scenario: you want to build a 6×7 calendar grid with 42 cells. If you use new Array(42) and then map over it, nothing happens. That’s a classic bug.
Here’s the correct pattern, which is both explicit and safe:
const totalCells = 42;
const today = new Date();
const year = today.getFullYear();
const month = today.getMonth();
const firstOfMonth = new Date(year, month, 1);
const startOffset = firstOfMonth.getDay(); // 0-6
const cells = Array.from({ length: totalCells }, (_, i) => {
const dayNumber = i - startOffset + 1;
const date = new Date(year, month, dayNumber);
const isCurrentMonth = date.getMonth() === month;
return { date, isCurrentMonth };
});
const rows = [];
for (let i = 0; i < cells.length; i += 7) {
rows.push(cells.slice(i, i + 7));
}
This pattern uses Array.from to guarantee dense arrays, which means every map and slice works as expected. If you had used new Array(42).map(...) here, you’d get 42 holes back, and your calendar would render nothing.
Another Practical Scenario: Chunking and Pagination
Pagination is another place sparse arrays cause subtle bugs. Suppose you want to build placeholder rows while data is loading. You might think this is fine:
const placeholders = new Array(pageSize);
const rows = placeholders.map(() => ({ loading: true }));
But map never runs, so rows stays sparse. The UI might render zero placeholders and appear broken. The fix is to make it dense first:
const rows = Array.from({ length: pageSize }, () => ({ loading: true }));
If you want to keep placeholders lightweight but still iterable, you can also do:
const rows = new Array(pageSize).fill(null).map(() => ({ loading: true }));
This creates the actual elements before mapping, which is what your component expects.
Comparison Table: Literal vs Constructor vs Modern Helpers
Here’s a quick comparison to anchor the differences. I use this as a mental checklist when reviewing code.
| Example | Risk Profile |
| --- | --- |
| [1, 2, 3] | Low risk, clear intent |
| new Array(3) | High risk in iteration |
| new Array(1, 2) | Medium risk (still ambiguous) |
| Array.of(3) | Low risk, explicit |
| Array.from({ length: 3 }, fn) | Low risk, explicit |
| new Array(3).fill(0) | Low risk, watch object refs |

I like to see Array.from or fill when the goal is a list of known length. I don’t want to guess whether a number is a value or a length.
How “Holes” Affect Common Methods (Quick Reference)
A few method behaviors are easy to forget, so I keep this at hand:
- forEach, map, filter, reduce, every, some: skip holes.
- find and findIndex: visit every index and treat holes as undefined.
- for...of: does not skip holes; the array iterator yields undefined for them.
- A classic for loop with an index: does not skip holes (you control the range via length).
- Object.keys(array): returns only existing indices.
- Array.prototype.keys(): returns all indices, including holes (as an iterator).
- Spread ([...arr]): materializes holes as undefined.
- Array.from(arr): materializes holes as undefined.
- JSON.stringify: turns holes into null.
Whenever I see a data pipeline involving map or filter, I ask myself whether the input could be sparse. If yes, I either densify it or avoid sparse creation in the first place.
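When the input might be sparse, my usual move is a tiny densify step — Array.from works here because it materializes holes as undefined (the helper name is mine):

```javascript
// Hypothetical helper: turn any possibly-sparse array into a dense one.
const densify = (arr) => Array.from(arr);

const sparse = new Array(3);
sparse[1] = "b";

const dense = densify(sparse);
console.log(0 in sparse); // false — a hole
console.log(0 in dense);  // true — now a real undefined
console.log(dense.map((v) => v ?? "default")); // [ 'default', 'b', 'default' ]
```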
The Subtle Difference Between “Create Then Fill” Patterns
These two patterns are not equivalent, and it matters:
const a = new Array(3).map(() => 1); // still [ <3 empty items> ] — the callback never runs
const b = new Array(3).fill(0).map(() => 1); // [1, 1, 1]
map only runs on existing elements. That’s why the first line does nothing. I’ve seen this bug in codebases more than once, especially when developers try to reduce allocations. The fix is simple: add fill or use Array.from.
If you need distinct objects, use:
const rows = Array.from({ length: 3 }, () => ({ ready: false }));
This avoids shared object references and avoids holes.
“But I Want to Pre-Allocate for Performance”
Sometimes developers reach for new Array(n) because they believe it pre-allocates memory and speeds up pushes. In modern engines, pre-allocation doesn’t usually help in a meaningful way unless you’re in a hot path and have proof.
Here’s the approach I take when performance matters:
1) Start with clarity: use Array.from or literals.
2) Measure: use profiling tools, not assumptions.
3) Switch to typed arrays if you need numeric performance or predictable memory.
If you’ve actually measured and found that new Array(n) helps, great. But it’s a specialized optimization, and I’d still comment the intent so future maintainers don’t “fix” it back to a literal.
A More Complete Example: Report Aggregation
Let’s build a small, realistic pipeline: data arrives from an API, we want to normalize it into a report, and we need to guarantee a fixed number of slots for a UI summary.
Bad version (sparse bug):
const reportSize = 5;
const report = new Array(reportSize);
const normalized = report.map((_, i) => ({
index: i,
value: 0
}));
console.log(normalized); // [ <5 empty items> ] — the callback never ran
Correct version (dense array):
const reportSize = 5;
const normalized = Array.from({ length: reportSize }, (_, i) => ({
index: i,
value: 0
}));
Now you can safely map, filter, and render. This small change also makes your intent explicit: “I want a list of length 5 and I want each slot to exist.”
Another Useful Primitive: Array.of
Most people forget Array.of, but I use it when I want a constructor call that never treats numbers as lengths.
const a = Array.of(5); // [5]
const b = Array.of(5, 6); // [5, 6]
const c = Array.of("5"); // ["5"]
Why bother? In some code styles, people prefer constructor-like factory calls for chaining or for generic utilities. Array.of removes the ambiguity in those cases.
Sparse Arrays as a Data Structure (When It’s Actually Fine)
Sparse arrays aren’t inherently “bad.” They’re just a different shape that needs different expectations. If I deliberately want to model “unknown or missing” positions, a sparse array can be a good fit.
For example, suppose I’m tracking sparse edits in a massive timeline where most positions are untouched. Storing only the touched indices can be efficient. A sparse array lets me do that without creating huge dense lists of null values.
const edits = new Array(1000);
// Only store actual edits
edits[10] = { type: "insert", value: "A" };
edits[200] = { type: "delete" };
console.log(10 in edits); // true
console.log(11 in edits); // false
In this case, Array() is the right tool, but I still keep the invariant clear: holes mean “no edit.” If someone tries to map over edits, I want a review comment that explains why that’s incorrect.
Edge Case: Holes and filter
filter also skips holes. That means you can accidentally “lose” intended positions.
const sparse = new Array(3);
const filtered = sparse.filter(Boolean);
console.log(filtered); // []
If your pipeline uses filter to remove falsy values, but your input might be sparse, you’re not just removing falsy values—you’re removing slots that didn’t exist. That can break index alignment. The safe approach is to densify first, or to use index-based loops.
Edge Case: includes and indexOf
includes and indexOf disagree about holes, which leads to weird results:
const sparse = new Array(3);
console.log(sparse.includes(undefined)); // true — includes treats holes as undefined
console.log(sparse.indexOf(undefined)); // -1 — indexOf skips holes
includes checks every index up to length using SameValueZero, so a hole matches undefined. indexOf only visits indices that actually exist, so it never sees the holes at all. Two lookup methods, two different answers for the same array — another place where using dense arrays avoids ambiguity.
Edge Case: sort and join
Sorting and joining sparse arrays can produce results that look like you had undefined values, even though you didn’t.
const sparse = new Array(3);
console.log(sparse.join(",")); // ",," (empty strings)
This is why “it worked in a log” is not a guarantee that your array is dense. Logging often hides the distinction.
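Sort has its own rule worth knowing: the default sort compacts real values first and moves holes to the end of the array. A sketch:

```javascript
const arr = [3, , 1]; // a hole at index 1

arr.sort(); // values are sorted, holes end up at the back

console.log(arr[0], arr[1]); // 1 3
console.log(2 in arr);       // false — the hole survived, now at the end
console.log(arr.length);     // 3
```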
A Minimal Debugging Checklist I Use
When I suspect a sparse array bug, I run through a quick checklist:
1) Check length vs Object.keys(arr).length.
2) Use arr.hasOwnProperty(i) for a known index.
3) Spread it: [...arr] to see if holes materialize.
4) Log arr in a console that shows slots.
5) Search for new Array(n) or Array(n) in the creation path.
Most of the time, step 1 or 2 reveals the issue immediately.
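Steps 1 and 2 collapse nicely into a one-line check (the helper name is mine; it assumes the array carries no extra non-index properties):

```javascript
// Hypothetical diagnostic: a dense array has exactly one own key per index.
const isDense = (arr) => Object.keys(arr).length === arr.length;

console.log(isDense([1, 2, 3]));    // true
console.log(isDense(new Array(3))); // false — three holes
console.log(isDense([undefined]));  // true — a real value, not a hole
```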
Tooling and Team Practices in 2026 Codebases
Today, I rarely rely on human memory to catch these mistakes. I use tools and conventions to prevent them:
- Lint rules: Many teams disallow Array(n) or new Array(n) by default unless explicitly allowed.
- Code review checks: If I see new Array(n) without .fill or Array.from, I flag it.
- TypeScript hints: I sometimes model sparse arrays with explicit types or comments to avoid misuse.
Even a small lint rule—like “disallow Array constructor with a single numeric argument”—can prevent dozens of bugs over time.
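As a sketch, ESLint's core no-array-constructor rule covers part of this out of the box (flat-config shape shown; the exact setup varies by project):

```javascript
// eslint.config.js — a minimal sketch, not a complete config
export default [
  {
    rules: {
      // Flags ambiguous uses of the Array constructor.
      "no-array-constructor": "error"
    }
  }
];
```

Note that the core rule still permits the single-argument new Array(n) form, since it has no literal equivalent; teams that want to ban it outright typically add a plugin rule or a custom restriction on top.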
Performance Considerations (Without the Hype)
There’s a lot of folklore about arrays and performance. Here’s my practical, measured view:
- Creation cost: new Array(n) can be marginally faster to allocate, but if you then fill it, the cost usually evens out.
- Iteration: Dense arrays are typically optimized better. Sparse arrays can de-optimize JIT paths because they behave more like generic objects.
- Memory: Sparse arrays can save memory when most indices are empty, but the overhead of dictionary-mode storage and slower property lookups can outweigh this in some cases.
The safe guidance is: write for clarity first, then measure. If you truly need performance, consider typed arrays or specialized data structures.
Alternative Patterns You Might Not Be Using
These aren’t strictly about Array() vs [], but they solve the same “I need a list of length N” problem without the pitfalls.
Spread + Array(n).keys() for indices
const indices = [...Array(5).keys()];
console.log(indices); // [0, 1, 2, 3, 4]
This uses a sparse array but immediately materializes it into a dense one through the iterator. It’s okay, but I still prefer:
const indices = Array.from({ length: 5 }, (_, i) => i);
It reads better in my opinion.
Object.fromEntries for indexed maps
If you need a mapping structure rather than a list, skip arrays entirely:
const map = Object.fromEntries(
Array.from({ length: 3 }, (_, i) => [i, { value: 0 }])
);
This avoids sparse array semantics altogether.
Map when indices are non-contiguous
If you’re using a sparse array to store non-contiguous indices, a Map may be more honest and often clearer:
const edits = new Map();
edits.set(10, { type: "insert" });
edits.set(200, { type: "delete" });
This makes “missing” explicit rather than implicit.
A Quick Word on Array() Without new
In JavaScript, Array() and new Array() behave the same. I still prefer the new form if I use it, because it signals “constructor semantics.” But in a team setting I avoid both, because they’re easy to misread.
If you want a factory style, Array.of is the most unambiguous option.
The length Property: Why It Misleads People
length is not a count of elements; it’s one more than the highest index, or an explicit length for sparse arrays. That’s why this works:
const arr = [];
arr[10] = "x";
console.log(arr.length); // 11
But arr only has one element. This property is powerful but easy to misinterpret. If your logic assumes length equals the number of items, avoid sparse arrays and avoid direct index assignments beyond the current end.
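length is also writable, which is another way holes sneak in: shrinking it truncates the array, while growing it creates holes, not undefineds. A sketch:

```javascript
const arr = [1, 2, 3, 4];

arr.length = 2;          // truncates: indices 2 and up are deleted
console.log(arr);        // [ 1, 2 ]

arr.length = 4;          // grows the array with holes, not undefined values
console.log(2 in arr);   // false
console.log(arr.length); // 4
```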
Real-World Scenario: Data Normalization for Charts
Charts are another spot where sparse arrays hurt. Suppose your chart expects 12 monthly values. If you do this:
const months = new Array(12);
const values = months.map(() => 0); // oops — the callback never runs, so values stays sparse
You’ll pass a sparse array to the chart, and the library might ignore holes or treat them as missing data points. The correct version is explicit:
const values = Array.from({ length: 12 }, () => 0);
Now every month has a value, and your chart can render consistently.
Pragmatic Rules I Use in Code Review
I don’t like fuzzy rules, so I keep these in mind when reviewing PRs:
1) If I see Array(n) or new Array(n) without .fill or Array.from, I assume it’s a bug until proven otherwise.
2) If the code uses map, forEach, or filter, I check whether the array could be sparse.
3) If the code uses length as a count of elements, I check for sparse creation upstream.
4) If sparse arrays are intentional, I expect a short comment that says so.
These rules catch 90% of the mistakes I see in large codebases.
FAQ I Hear From Teams
“Is Array() ever faster than []?”
Sometimes in micro-benchmarks, yes. But in real-world code, the difference is usually noise compared to clarity. If you have a performance hotspot, measure it. Otherwise, default to literals.
“Does Array.from cost more?”
It’s a little heavier than [], but it gives you the array you actually need. If you’re creating a list of length N anyway, Array.from is one of the most readable choices.
“Why does new Array(1) exist if it’s so confusing?”
It’s historical. JavaScript’s original constructor semantics were designed long before modern patterns existed. Array.of was introduced to fix this confusion without breaking old code.
“Should we ban the Array constructor in lint?”
I often do. But I usually allow Array.of and allow new Array(n).fill(...) if it’s clearly intentional.
A Final Practical Example: Stubs and Testing
When writing tests, I often need “fake arrays” of a given length. Here’s the safe way I do it:
const makeUsers = (count) =>
Array.from({ length: count }, (_, i) => ({
id: i + 1,
name: `User ${i + 1}`
}));
This is explicit, produces a dense array, and avoids the empty-slot trap. It’s also self-documenting: anyone reading it knows the array is deliberately populated.
Summary: My Default Mental Model
If I had to summarize the difference in one sentence: [] creates actual elements, while Array(n) creates holes unless you give it non-numeric values or multiple arguments.
Whenever I’m unsure, I default to the literal or to Array.from, because they are explicit and predictable. The few times I choose new Array(n) are the times I truly want holes, and when I do, I make that intent painfully obvious.
Final Recommendation (Short Version)
If you need a quick rule to carry into your next code review:
- Use [] for literal lists.
- Use Array.from or fill for fixed-length lists with values.
- Use Array.of for constructor-like creation without ambiguity.
- Avoid new Array(n) unless you truly want holes and can explain why.
That’s it. The array constructor isn’t wrong; it’s just sharp. When you choose your creation style deliberately, you avoid the silent edge cases and make your code easier to reason about for everyone who comes after you.