You’re staring at a list that renders newest items first: live chat messages, audit events, notifications, a timeline. The data arrives in chronological order, but the UI needs reverse-chronological order. Your first instinct is usually “put the new item at the front.” That’s exactly what Array.prototype.unshift() does—and it’s also where people accidentally introduce slowdowns, subtle bugs, and state-management headaches.
When I reach for unshift(), I do it with intent: I know it mutates the array, I know it has a cost proportional to the array size, and I know the return value is not the element I added. If you internalize those three points, you can use unshift() confidently in day-to-day JavaScript and TypeScript.
You’ll come away knowing how unshift() behaves (including edge cases), how it compares to modern “immutable update” patterns used in UI frameworks, and what I recommend when you’re dealing with large arrays, hot loops, or data structures that need fast prepends.
What unshift() Really Does to Your Array
At a high level, unshift() adds one or more elements to the beginning of an array and returns the array’s new length.
The key detail is how it makes room at index 0: it shifts existing elements to higher indexes. If your array has 4 elements, each element gets moved one position to the right. If your array has 40,000 elements, that’s a lot of movement.
A mental model I like is a row of seats in a theater. push() is adding a person at the end—easy. unshift() is adding a person at seat 1, which means everyone scoots over one seat. For small groups, nobody cares. For a packed theater, it gets loud.
Another important point: unshift() mutates the original array in place. If multiple parts of your program share the same array reference, all of them “see” the change. That can be what you want (data processing pipelines, single-owner arrays), or it can be a source of bugs (shared state, UI state, caching layers).
Finally, unshift() is an array method on Array.prototype. In modern JavaScript engines, it’s been stable for years, and you can treat it as universally available in real-world production environments.
A detail that matters in performance-sensitive code: arrays are optimized heavily when they behave like contiguous, dense lists. Prepends force the engine to update a lot of indexes. One unshift() here and there is rarely a problem, but repeated prepends tend to destroy the “cheap append” advantage you normally get with arrays.
If you’re ever unsure what’s happening, the simplest sanity check is to print indexes before and after:
function demoIndexShift() {
  const arr = ['b', 'c', 'd'];
  console.log('before:', arr.map((v, i) => `${i}:${v}`).join(' '));
  arr.unshift('a');
  console.log('after :', arr.map((v, i) => `${i}:${v}`).join(' '));
}
demoIndexShift();
That “after” line is your reminder: you didn’t just add one value; you changed where everything lives.
Syntax, Parameters, and Return Value (The Part People Misread)
The signature is straightforward:
array.unshift(element1, element2, ..., elementX);
- You can pass one element or many.
- The elements are inserted in the order you pass them.
- The method returns a number: the new array.length.
That last bullet is where I see mistakes most often. People write something like:
const queue = ['deploy', 'backup'];
const job = queue.unshift('migrate');
console.log(job); // Not 'migrate'.
job will be 3, because the array now has three items.
Here’s a complete runnable example that makes the behavior obvious:
function demoUnshiftBasics() {
  const words = ['team', 'shipping', 'today'];
  const newLength = words.unshift('We');
  console.log('newLength:', newLength);
  console.log('words:', words);
}
demoUnshiftBasics();
If you add multiple items, order is preserved:
function demoMultipleInsert() {
  const ids = [301, 302, 303];
  const newLength = ids.unshift(101, 102);
  console.log('newLength:', newLength);
  console.log('ids:', ids);
}
demoMultipleInsert();
You’ll get [101, 102, 301, 302, 303], not a reversed sequence.
One more subtlety I teach juniors: const does not mean “immutable.” It means the binding can’t be reassigned. The array can still be changed.
function demoConstMutation() {
  const tasks = ['code review', 'release notes'];
  tasks.unshift('security scan');
  console.log(tasks);
}
demoConstMutation();
That’s valid JavaScript, because the reference stays the same.
Two quick, practical notes on parameters:
1) unshift() accepts any values, including undefined, null, objects, arrays, and functions. That means you can accidentally prepend undefined if you forget a return statement.
function maybeGetItem(flag) {
  if (flag) return { id: 1 };
  // no return -> undefined
}
const items = [{ id: 2 }];
items.unshift(maybeGetItem(false));
console.log(items); // [undefined, { id: 2 }]
2) When you prepend objects, you’re prepending references, not copies. If you later mutate that object, every array that contains it sees the change.
const a = { label: 'original' };
const list = ['x', 'y'];
list.unshift(a);
a.label = 'changed';
console.log(list[0].label); // 'changed'
That isn’t a problem with unshift() specifically, but prepending objects is a common place people first “notice” shared references.
Mutation vs Modern State Patterns (React, Stores, and “Immutable Updates”)
If you’re writing UI code in 2026—React, Solid, Svelte, Vue, or a lightweight store—mutating arrays in place is often the fastest route to stale renders or confusing state bugs.
In my experience, the rule that keeps teams sane is:
- Mutate arrays you fully own.
- Avoid mutation for shared state, cached state, or UI state.
When you want “prepend” semantics without mutating the original array, I recommend building a new array:
function prependImmutable(items, newItem) {
  return [newItem, ...items];
}
const original = ['invoice-104', 'invoice-105'];
const next = prependImmutable(original, 'invoice-103');
console.log('original:', original);
console.log('next:', next);
That pattern is extremely readable, and it plays nicely with change detection.
Here’s how I think about the trade in a practical way:
- Single prepend: arr.unshift(item) (mutating) vs [item, ...arr] (immutable copy).
- Batch prepend: a loop with repeated inserts or a manual copy + insert (mutating) vs [...newItems, ...arr] (one spread).
- New length: arr.unshift(...) returns it directly; with the immutable version, read newArr.length.
If you’re using a reducer pattern, the immutable version is typically what you want:
function notificationsReducer(state, action) {
  switch (action.type) {
    case 'notification.received': {
      // Avoid mutating state.notifications
      return {
        ...state,
        notifications: [action.payload, ...state.notifications],
      };
    }
    default:
      return state;
  }
}
When I do choose unshift() in app code, it’s usually in lower-level logic where the array is not shared—like building a result list inside one function.
One nuance I’ve learned the hard way: even when you “own” the array, you might still be in trouble if other code holds a reference to it because you passed it somewhere earlier. Ownership is about references, not intentions.
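A tiny demo makes the point concrete: once a second name holds the same reference, “my” prepend is “their” change too.

```javascript
// "Ownership" ends the moment a reference escapes: both names point
// at the same array, so a prepend through one is visible through the other.
function demoAliasing() {
  const mine = [1, 2, 3];
  const alias = mine; // same reference, not a copy

  mine.unshift(0);
  return alias; // [0, 1, 2, 3] - the "other" array changed too
}

console.log(demoAliasing());
```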
If you’re in a codebase that cares about immutability, I like to make it obvious in naming:
- notifications (shared state) -> avoid mutation
- mutableNotificationsBuffer (internal) -> mutation is expected
That’s not style-policing; it’s a pragmatic signal to the next person who touches the code.
Performance Reality: Prepending Can Get Expensive
I don’t treat unshift() as “slow” by default. I treat it as “cost grows with array size.”
Because prepending shifts existing elements, the operation is generally O(n). That means:
- For small arrays (dozens or a few hundred items), it’s typically fine.
- For medium arrays (thousands), it can start showing up in profiles if it’s in a hot path.
- For large arrays (tens of thousands or more), repeated unshift() can create noticeable jank once you combine the shifting cost with rendering and garbage collection.
The bigger trap is repeated prepends in a loop, which can turn into quadratic work:
function buildLogSlow(events) {
  const result = [];
  for (const event of events) {
    // This shifts the entire result array every time.
    result.unshift(event);
  }
  return result;
}
If you actually want reversed order, I recommend a different approach:
function buildLogFast(events) {
  // Copy then reverse once.
  return [...events].reverse();
}
Or, if you’re constructing from scratch and don’t need to preserve the input:
function buildLogFastestFromScratch(events) {
  const result = [];
  for (const event of events) {
    result.push(event);
  }
  result.reverse();
  return result;
}
If your real requirement is “fast adds on both ends,” an array is the wrong data structure for heavy prepending. In that case, I usually reach for one of these patterns:
- Deque-like structure: a small custom wrapper around an object map with head/tail indexes.
- Chunked buffer: store pages of items and stitch them together for display.
- Two-array trick: keep a front array (newest-first) and a back array; flatten only when needed.
Here’s a simple deque-like pattern that avoids shifting costs (complete and runnable):
class Deque {
  constructor() {
    this._store = Object.create(null);
    this._head = 0;
    this._tail = 0;
  }
  pushBack(value) {
    this._store[this._tail++] = value;
    return this.size();
  }
  pushFront(value) {
    this._store[--this._head] = value;
    return this.size();
  }
  popFront() {
    if (this.size() === 0) return undefined;
    const value = this._store[this._head];
    delete this._store[this._head++];
    return value;
  }
  size() {
    return this._tail - this._head;
  }
  toArray() {
    const result = [];
    for (let i = this._head; i < this._tail; i++) {
      result.push(this._store[i]);
    }
    return result;
  }
}
function demoDeque() {
  const timeline = new Deque();
  timeline.pushBack('event-1');
  timeline.pushBack('event-2');
  timeline.pushFront('event-0');
  console.log(timeline.size());
  console.log(timeline.toArray());
}
demoDeque();
I’m not saying “never use unshift().” I’m saying: when you feel tempted to call it thousands of times, switch strategies.
One practical profiling tip: don’t guess. If you suspect unshift() is a bottleneck, wrap the hot path with a quick benchmark in Node or a browser devtools snippet and compare approaches (unshift() in a loop vs push() then reverse() vs batch concat). You don’t need exact numbers; you just want to know if you’re dealing with micro-costs or a real cliff.
Edge Cases You’ll Actually Run Into
unshift() looks simple, but JavaScript arrays have quirks. Here are the edge cases I watch for.
Empty arrays
Prepending into an empty array is completely fine:
function demoEmptyArray() {
  const pending = [];
  const lengthAfter = pending.unshift('job-001');
  console.log(lengthAfter);
  console.log(pending);
}
demoEmptyArray();
Sparse arrays (arrays with holes)
Arrays can be sparse, meaning some indexes don’t exist. unshift() will still move indexes around, and holes remain holes. That’s usually what you want, but it can surprise you if you expect every index to be “filled.”
function demoSparse() {
  const readings = [];
  readings[2] = 42; // indexes 0 and 1 are holes
  readings.unshift(10);
  console.log(readings);
  console.log('0 in readings:', 0 in readings);
  console.log('1 in readings:', 1 in readings);
}
demoSparse();
A practical takeaway: if your code relies on “no holes,” be careful with arr.includes(undefined). includes() treats a hole as undefined, while indexOf(undefined) skips holes entirely. A hole is not the same thing as an element whose value is undefined.
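If you want to see that asymmetry directly, here’s a small check using the same sparse setup:

```javascript
// includes() treats holes as undefined; indexOf() skips holes entirely.
const readings = [];
readings[2] = 42; // indexes 0 and 1 are holes

console.log(readings.includes(undefined)); // true
console.log(readings.indexOf(undefined)); // -1
console.log(0 in readings); // false - the index truly does not exist
```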
“Array-like” objects
unshift() is generic enough that you can apply it to array-like objects (things with indexed keys and a length). This comes up in legacy code, DOM collections, or custom structures.
function demoArrayLike() {
  const arrayLike = { 0: 'alpha', 1: 'beta', length: 2 };
  const newLength = Array.prototype.unshift.call(arrayLike, 'start');
  console.log('newLength:', newLength);
  console.log(arrayLike);
}
demoArrayLike();
In modern code, I usually convert to a real array instead because it’s clearer:
const realArray = Array.from(document.querySelectorAll('button'));
Frozen arrays
If an array is frozen (Object.freeze()), unshift() will throw a TypeError because it can’t change the object. (Array methods like unshift() throw on frozen arrays even outside strict mode, because they use throwing assignment semantics internally.)
function demoFrozen() {
  'use strict';
  const config = Object.freeze(['stable', 'safe']);
  try {
    config.unshift('new');
  } catch (err) {
    console.log('Error:', err.name);
  }
}
demoFrozen();
When I’m debugging “why did this crash only in production,” frozen state is one of the first things I check.
Non-writable length and exotic arrays
This is rarer, but worth knowing: if length can’t be written (for example, by defining it as non-writable), many array mutations will fail.
function demoNonWritableLength() {
  'use strict';
  const arr = ['a', 'b'];
  Object.defineProperty(arr, 'length', { writable: false });
  try {
    arr.unshift('z');
  } catch (err) {
    console.log('Error:', err.name);
  }
}
demoNonWritableLength();
You probably won’t do this on purpose, but libraries sometimes create “exotic” objects, and it’s helpful to recognize the failure mode.
Common Mistakes (and the Fix I Actually Ship)
These are the mistakes I see repeatedly, and how I steer teams away from them.
Mistake 1: Treating the return value as the inserted element
If you want the inserted element, you already have it—because you passed it in. If you want the new array, you already have it—because it was mutated.
Correct pattern:
const auditEvents = ['login', 'viewed-report'];
auditEvents.unshift('password-changed');
console.log(auditEvents[0]);
Mistake 2: Mutating shared arrays in UI state
I see this when someone stores arrays in a global store and prepends directly:
// Risky if subscribers expect immutability
state.notifications.unshift(newNotification);
The fix I recommend is to return a new array reference:
state.notifications = [newNotification, ...state.notifications];
If you’re using a library that enforces immutability, it will catch this for you. If you’re not, your tests should.
Mistake 3: Calling unshift() in a loop to reverse order
This is the stealth performance killer. Prefer reverse() once.
Mistake 4: Confusing unshift() with push() when reading code
I encourage teams to name variables to reflect order: newestFirst, oldestFirst, head, tail. That reduces logic bugs more than any clever trick.
Mistake 5: Prepending “batches” one item at a time
If you have a batch, prepend it as a batch:
function prependBatch(items, newItems) {
return [...newItems, ...items];
}
This is clearer and usually faster than repeated prepends.
Mistake 6: Using unshift() and then relying on old indexes
I’ve seen bugs like this in event lists:
const items = [{ id: 'a' }, { id: 'b' }];
const selectedIndex = 1; // points to 'b'
items.unshift({ id: 'new' });
// selectedIndex is still 1, but now it points to 'a'
console.log(items[selectedIndex].id);
The fix isn’t “don’t use unshift().” The fix is: don’t treat array indexes as stable IDs. If you need stability, store selectedId and re-find the index, or use a map keyed by ID.
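Here’s one concrete shape of that fix; findSelectedIndex is my illustrative name, not an established helper:

```javascript
// Track selection by a stable ID and re-derive the index after mutations.
function findSelectedIndex(items, selectedId) {
  return items.findIndex((item) => item.id === selectedId);
}

const items = [{ id: 'a' }, { id: 'b' }];
const selectedId = 'b'; // survives prepends, unlike a raw index

items.unshift({ id: 'new' });
console.log(findSelectedIndex(items, selectedId)); // 2 - still points at 'b'
```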
Mistake 7: Accidentally double-prepending in concurrent or async flows
In UI apps, it’s easy to call a prepend path twice when reconnecting websockets or retrying. If you prepend by raw object identity, you get duplicates quickly.
A pattern I like is “prepend with dedupe”:
function prependUniqueById(items, newItem) {
  const seen = new Set(items.map((x) => x.id));
  if (seen.has(newItem.id)) return items;
  return [newItem, ...items];
}
If you need this in a hot path, maintain the Set alongside the array instead of rebuilding it every time.
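A sketch of that hot-path variant; createDedupingList is an illustrative name, and I’m assuming items carry stable id fields:

```javascript
// Keep the Set of seen IDs in sync with the list instead of rebuilding
// it on every prepend. Prepends stay immutable; the Set is internal state.
function createDedupingList() {
  let items = [];
  const seenIds = new Set();
  return {
    prepend(item) {
      if (seenIds.has(item.id)) return items; // duplicate: no-op
      seenIds.add(item.id);
      items = [item, ...items]; // new reference for change detection
      return items;
    },
    list() {
      return items;
    },
  };
}

const feed = createDedupingList();
feed.prepend({ id: 'n1' });
feed.prepend({ id: 'n2' });
feed.prepend({ id: 'n1' }); // ignored
console.log(feed.list().map((x) => x.id)); // ['n2', 'n1']
```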
Practical Scenarios Where unshift() Is a Great Fit
I don’t treat unshift() as a trivia method. It’s useful in real systems—when used deliberately.
Scenario 1: Building a display list from a single-owner array
If a function owns the array and returns it, mutation is often the simplest option.
function buildBreadcrumbs(routeSegments) {
  const crumbs = [];
  // Start with Home at the front
  crumbs.unshift({ label: 'Home', href: '/' });
  let currentPath = '';
  for (const segment of routeSegments) {
    currentPath += '/' + segment;
    crumbs.push({ label: segment, href: currentPath });
  }
  return crumbs;
}
function demoBreadcrumbs() {
  const crumbs = buildBreadcrumbs(['settings', 'billing', 'invoices']);
  console.log(crumbs);
}
demoBreadcrumbs();
I like this because it’s readable: “Home goes first.”
Scenario 2: “Newest first” logs for small-to-medium volumes
For a few hundred items, it’s perfectly reasonable:
function createAuditLog(maxItems = 200) {
  const events = [];
  return {
    record(event) {
      events.unshift({ event, at: new Date().toISOString() });
      if (events.length > maxItems) {
        events.pop();
      }
      return events.length;
    },
    list() {
      return [...events];
    },
  };
}
function demoAuditLog() {
  const log = createAuditLog(3);
  log.record('signed-in');
  log.record('opened-dashboard');
  log.record('exported-data');
  log.record('signed-out');
  console.log(log.list());
}
demoAuditLog();
Note the pattern: mutate internal state, return copies when exposing it.
Scenario 3: Undo stacks that show the most recent action first
An undo stack is naturally “last action first.” unshift() can represent “top of stack is index 0” (even though many stacks use the end of the array). I only recommend this for modest sizes.
function createUndoHistory(limit = 50) {
  const history = [];
  return {
    push(action) {
      history.unshift(action);
      if (history.length > limit) history.pop();
    },
    peek() {
      return history[0];
    },
    pop() {
      return history.shift();
    },
    list() {
      return [...history];
    },
  };
}
If your undo history grows huge or updates extremely frequently, I’d rather store the “top” at the end with push()/pop() to avoid repeated shifts.
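For contrast, here’s the same API with the top stored at the end; createUndoHistoryFast is an illustrative name, and the point is that the frequent operations become push()/pop():

```javascript
// Top of stack lives at the END of the array, so push()/pop() stay cheap.
// Only the overflow trim touches the front, and only once the limit is hit.
function createUndoHistoryFast(limit = 50) {
  const history = [];
  return {
    push(action) {
      history.push(action);
      if (history.length > limit) history.shift(); // drop the oldest
    },
    peek() {
      return history[history.length - 1]; // most recent action
    },
    pop() {
      return history.pop();
    },
    list() {
      return [...history].reverse(); // expose a newest-first copy
    },
  };
}

const undo = createUndoHistoryFast(2);
undo.push('type');
undo.push('bold');
undo.push('delete'); // limit exceeded: 'type' is trimmed
console.log(undo.peek()); // 'delete'
console.log(undo.list()); // ['delete', 'bold']
```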
Scenario 4: Small priority queues (human-scale, not algorithmic)
If you have “urgent tasks go first,” unshift() can be a clean signal:
function createWorkQueue() {
  const q = [];
  return {
    enqueue(task) {
      q.push(task);
    },
    enqueueUrgent(task) {
      q.unshift(task);
    },
    next() {
      return q.shift();
    },
    snapshot() {
      return [...q];
    },
  };
}
I’m deliberately not calling this a true priority queue. If you need real priorities, use a heap. But for “one urgent lane,” this is often enough.
Key Takeaways and What I’d Do Next
When you prepend with unshift(), you’re choosing mutation plus index-shifting. For small arrays, that choice is often fine and pleasantly direct. For shared state or large collections, it’s a common source of bugs and slowdowns.
Here’s the short checklist I keep in my head:
- If you need to prepend and you own the array (no shared references), unshift() is perfectly reasonable.
- If the array is UI state or shared state, prefer immutable prepend: [item, ...arr].
- If you’re prepending in a loop, stop and ask if you actually want push() + reverse() or a batch operation.
- If you need fast prepends at scale, use a deque/ring buffer/chunked approach instead of fighting array semantics.
- If you’re debugging weird behavior, verify the return value (it’s the new length) and check whether the array is frozen or shared.
From there, what I do next depends on the codebase:
- If this is application state: I standardize on immutable updates and enforce it with tests (and sometimes freezing in development).
- If this is a hot path: I profile and switch from repeated unshift() to a better strategy.
- If this is a small helper: I keep unshift() because it’s expressive.
Understanding the “Front of the Array” (Index 0 as a Design Choice)
unshift() is really about one opinionated idea: “the front is index 0.” That’s obvious, but it has consequences.
A lot of data structures choose the other end. Many stacks put the “top” at the end of the array so you can use push()/pop() (both typically cheap). If you choose index 0 as the “top,” you’re committing to unshift()/shift() (both typically involve moving elements).
So I like to make the choice explicit:
- If I need a stack: I use end-of-array semantics (push()/pop()).
- If I need a queue: I often use end-of-array for enqueue and start-of-array for dequeue (push()/shift()), but only for small queues.
- If I need both ends fast: I don’t use a plain array.
That clarity prevents future refactors like: “Why is shift() suddenly slow?” It was always going to be slow once the list grew.
unshift() vs Other Array Methods (What I Reach For and Why)
It helps to compare unshift() with the handful of methods that sound similar in conversation.
push() vs unshift()
- push(x) adds to the end
- unshift(x) adds to the beginning
When I’m building large arrays incrementally, I prefer push() and then fix order at the end (reverse, sort, render newest-first with an index mapping, etc.).
shift() vs unshift()
- unshift(x) puts something at the front
- shift() removes from the front
They pair naturally, but they both imply “front operations,” and those imply index movement.
splice() as a general insertion tool
splice() can insert at any index, including 0:
const arr = ['b', 'c'];
arr.splice(0, 0, 'a');
console.log(arr);
So why use unshift() at all?
- unshift() reads like intent: prepend.
- splice(0, 0, x) reads like mechanics: insert at 0.
I use unshift() when intent matters and I’m not already doing a more complex splice.
concat() and spread for immutable prepends
Immutable prepend is usually one of these:
const next = [newItem, ...items];
// or
const next = [newItem].concat(items);
I personally prefer spread for readability, and concat when I want to avoid spread in extremely large arrays (or when I’m trying to keep a certain style consistent).
TypeScript Notes: unshift() and Readonly Arrays
In TypeScript, unshift() is available on T[] (mutable arrays) but not on readonly T[].
That’s a feature, not a limitation: if you model your state as readonly, TypeScript nudges you toward immutable updates.
type Notification = { id: string; text: string };
function prependImmutable(items: readonly Notification[], item: Notification) {
  return [item, ...items];
}
If you find yourself fighting the type system to use unshift(), it’s usually a sign your data is meant to be treated as immutable.
One more TS gotcha: because unshift() returns a number, it’s easy to write an expression that type-checks but uses the new length where you meant the element. I keep unshift() on its own line in TS-heavy code to make the side effect obvious.
Production Patterns: Bounded Buffers Without Surprises
A lot of “newest first” lists are also bounded. You only want the last N messages/events.
For modest sizes, unshift() + pop() is a solid pattern (you saw it in the audit log example). If you want to make it safer and more reusable, I like wrapping it in a tiny utility.
function createBoundedNewestFirstBuffer(limit) {
  const items = [];
  return {
    add(value) {
      items.unshift(value);
      if (items.length > limit) items.pop();
      return items.length;
    },
    values() {
      return [...items];
    },
    clear() {
      items.length = 0;
    },
    size() {
      return items.length;
    },
  };
}
function demoBoundedBuffer() {
  const buf = createBoundedNewestFirstBuffer(3);
  buf.add('a');
  buf.add('b');
  buf.add('c');
  buf.add('d');
  console.log(buf.values());
}
demoBoundedBuffer();
Why wrap it?
- I can enforce copy-on-read.
- I can swap out the internal structure later (deque, ring buffer) without rewriting every callsite.
- I can test it once and trust it.
If you want better performance at larger sizes, the next step is usually a ring buffer.
Ring buffer (fast, bounded, stable)
A ring buffer avoids shifting by writing into a fixed-size array and using an index.
function createRingBuffer(limit) {
  const buf = new Array(limit);
  let count = 0;
  let head = 0; // points to newest
  return {
    add(value) {
      head = (head - 1 + limit) % limit;
      buf[head] = value;
      count = Math.min(count + 1, limit);
      return count;
    },
    valuesNewestFirst() {
      const out = [];
      for (let i = 0; i < count; i++) {
        out.push(buf[(head + i) % limit]);
      }
      return out;
    },
    size() {
      return count;
    },
  };
}
This is the kind of upgrade I reach for when unshift() is conceptually correct but operationally too expensive.
Practical UI Scenario: Chat Messages Without Re-Rendering the World
Chat is where I see unshift() decisions become architectural.
- Incoming messages arrive in time order.
- Many chat UIs show newest at the bottom, but some show newest at the top.
- Infinite scroll often loads older messages when you scroll up.
If you’re building a newest-first list, you might prepend new messages. That’s fine until your message list is large and you’re updating frequently.
What I recommend in practice:
1) Keep your canonical storage in a stable order (often oldest-first).
2) Derive the display order with a cheap view (reverse copy, virtualized list, or index mapping).
3) Only use unshift() if you’ve measured that it’s acceptable.
Here’s a lightweight “store oldest-first, render newest-first” approach:
function addMessageOldestFirst(messages, msg) {
  // cheap append
  messages.push(msg);
}
function getMessagesNewestFirst(messages) {
  // derive view
  return [...messages].reverse();
}
Yes, reverse() creates work too, but you’re paying one predictable cost per render, not a shifting cost per message insertion (and you can optimize further with virtualization).
If you truly need newest-first storage (for example, you constantly need the latest 20 at index 0..19), then unshift() can be correct. Just keep it bounded.
Debugging and Testing: Catching Mutation Bugs Early
If a team has been bitten by accidental mutation, I like to make mutation failures loud in tests.
Freeze state in tests (or development)
Freezing arrays and objects turns many silent bugs into immediate errors.
function deepFreeze(obj) {
  if (!obj || typeof obj !== 'object') return obj;
  Object.freeze(obj);
  for (const key of Object.keys(obj)) deepFreeze(obj[key]);
  return obj;
}
function reducer(state, action) {
  switch (action.type) {
    case 'add':
      // WRONG: state.items.unshift(action.payload);
      return { ...state, items: [action.payload, ...state.items] };
    default:
      return state;
  }
}
const state = deepFreeze({ items: [] });
const next = reducer(state, { type: 'add', payload: 'x' });
console.log(next.items);
I’m not freezing everything in production; I’m using it as a guardrail during development and CI.
Assert reference changes for immutable updates
A simple test idea: when you expect immutability, assert that references change.
function prependImmutable(items, item) {
  return [item, ...items];
}
const a = ['b', 'c'];
const b = prependImmutable(a, 'a');
console.log(a === b); // false
Make unshift() side effects obvious in code review
This is non-technical, but practical: I prefer unshift() on its own line and I avoid embedding it inside expressions.
Bad (harder to read):
if (arr.unshift(x) > 10) arr.pop();
Better (explicit side effects):
arr.unshift(x);
if (arr.length > 10) arr.pop();
Alternatives Cheat Sheet (When I Swap Out unshift())
When unshift() is the wrong tool, I usually switch to one of these patterns.
1) Immutable prepend
Use when: UI state, shared arrays, reducers.
const next = [item, ...items];
2) Batch prepend
Use when: you receive a page of data and you want it at the front.
const next = [...newItems, ...items];
3) Append + reverse once
Use when: you’re building a reversed list.
const out = [];
for (const x of input) out.push(x);
out.reverse();
4) Deque / ring buffer
Use when: high-frequency prepends or huge lists.
- Deque: general-purpose, flexible size
- Ring buffer: fixed size, very fast
Quick Reference: unshift() Behavior in One Page
- What it does: adds one or more elements to the beginning of an array.
- Mutates: yes, the original array changes.
- Returns: the new length (a number).
- Complexity: generally O(n) because existing elements shift.
- Order: preserved for multiple args (arr.unshift(a, b) results in a then b at the front).
- Works on array-likes: yes, via Array.prototype.unshift.call(obj, ...).
- Fails on frozen arrays: throws a TypeError (it cannot mutate the array, strict mode or not).
If you remember only one thing: unshift() is not just “insert”; it’s “insert and shift everything.” That’s fine when the array is small or privately owned. It’s painful when the array is large, shared, or in a hot loop.