My wake‑up call happened on a dashboard that looked fine in a demo but crawled in real use. A few hundred rows, a couple charts, and suddenly clicks felt sticky. I had already moved logic into hooks and split components, yet the UI still lagged. That experience reshaped how I think about React: rendering isn’t slow by default, but unnecessary work adds up fast. Once you see how hooks schedule work, memoize results, and stabilize references, you can shape render cost with the same care you shape state.
I’m going to share the techniques I rely on when I need React apps to feel instant. You’ll see how to cut repeat computation with useMemo, keep stable event handlers with useCallback, and prevent wasted renders with React.memo. I’ll also show how to structure state to avoid needless updates, how to use useTransition and useDeferredValue to keep input responsive, and how to measure the changes without guessing. I’ll use simple analogies and full examples so you can copy, run, and adapt them right away. My goal is to help you build the mental model that lets you choose the right hook at the right time, not just sprinkle them everywhere.
The render model I keep in my head
When React renders, it runs your component function to produce a new tree description. That doesn’t always mean the browser updates the DOM, but it does mean your JavaScript runs. I treat a render like a kitchen line: the chef can re‑plate a dish quickly if the ingredients are ready, but it’s still work. If you call an expensive function on every render or allocate new arrays for props each time, you’re making the chef redo prep that could have been done once.
There are three things I watch:
1) What triggers renders? State updates, parent renders, and context updates are the usual triggers. If a parent renders, children render by default even if nothing meaningful changed.
2) What work happens during render? Any calculation in your component body runs each render. If it’s a large loop, data reshape, or sort, that cost repeats.
3) What work happens after render? Effects and layout work may run, and heavy work there can still block input.
I also remember that React Strict Mode in development intentionally re‑invokes render to find unsafe side effects. If a component feels slow in development, I confirm the same behavior in production builds before changing architecture. That saved me from “fixing” a problem that only existed in dev.
Here’s a quick table I use to decide when to reach for memoization hooks.
| Traditional approach | Hook‑based fix |
| --- | --- |
| Recalculate each render | useMemo caches the derived value |
| Inline arrow function | useCallback keeps a stable reference |
| Inline component | React.memo skips renders when props are unchanged |
This table is not a rulebook. It’s a reminder that the best fixes are usually about removing repeated work rather than adding layers.
useMemo: caching expensive derived values
useMemo stores the result of a function and recomputes only when dependencies change. I treat it like a lunch prep station. If the ingredients didn’t change, I don’t re‑chop them; I just grab the prepped bowl.
The mistake I see most often is wrapping trivial calculations in useMemo. If a function takes microseconds, caching can cost more than redoing it. I reserve useMemo for computations that are heavy, grow with data size, or trigger a lot of garbage collection.
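To make the trade-off concrete, here is a simplified, framework-free model of the caching idea behind useMemo. This is not React's implementation, just a sketch: keep the last result, and recompute only when a dependency fails an Object.is comparison.

```javascript
// Simplified model of useMemo's caching idea (NOT React's actual source):
// store the last deps and value, recompute only when a dep changes.
function createMemo() {
  let cache = null; // { deps, value }
  return function memo(compute, deps) {
    const same =
      cache &&
      cache.deps.length === deps.length &&
      cache.deps.every((d, i) => Object.is(d, deps[i]));
    if (same) return cache.value;
    cache = { deps, value: compute() };
    return cache.value;
  };
}

// Usage: the expensive sort runs once per distinct input reference.
let sortCount = 0;
const memo = createMemo();
const sortOnce = nums =>
  memo(() => {
    sortCount += 1;
    return [...nums].sort((a, b) => a - b);
  }, [nums]);

const data = [3, 1, 2];
sortOnce(data);
sortOnce(data); // same reference: cache hit, no recompute
console.log(sortCount); // 1
```

Notice the overhead: the cache object, the dependency comparison, the closure. For a microsecond calculation, that bookkeeping costs more than just redoing the work.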
Here’s a full example that you can run. It builds a list from a count, and the list only rebuilds when the count changes. I also keep the function inside useMemo to avoid accidental work.
// index.js
import React, { useMemo, useState } from "react";
import { createRoot } from "react-dom/client";
function InventoryPreview() {
const [quantity, setQuantity] = useState(0);
// Build a long list only when quantity changes
const items = useMemo(() => {
const result = [];
for (let i = 0; i < quantity * 100; i += 1) {
result.push({ id: i, label: `Item ${i + 1}` });
}
return result;
}, [quantity]);
return (
<div>
<button onClick={() => setQuantity(q => q + 1)}>Quantity: {quantity}</button>
<ul>
{items.map(item => (
<li key={item.id}>{item.label}</li>
))}
</ul>
</div>
);
}
const root = createRoot(document.getElementById("root"));
root.render(<InventoryPreview />);
Two rules I follow with useMemo:
- The function should be pure. If it reads from the network or updates state, it belongs in an effect, not in memoization.
- The dependency array should match exactly what the calculation uses. I don’t add extra deps “just in case” because that defeats caching.
When you look at useMemo in real apps, you’ll usually pair it with derived data, expensive filtering or sorting, and expensive formatting. If the list is small, I skip it. If the list can hit thousands of rows, I add it before people complain.
useMemo edge cases that bite
There are a few tricky corners that make useMemo less effective than people expect:
- Shallow equality in dependencies. If a dependency is an object that is recreated every render, the memo recomputes every time. That makes useMemo a no‑op.
- Expensive but unstable sources. If a derived value depends on a large object that changes frequently, memoization might help only a little. In those cases, I consider restructuring state or normalizing data so the part I need stays stable.
- Large memo results. useMemo caches the result in memory. If you store a huge array and don’t need it later, you might increase memory usage without a win. I sometimes memoize a light index or a Map instead of the full data set.
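The first of these edge cases is worth seeing in plain JavaScript: dependency comparisons use reference identity (Object.is), not deep equality, so an object literal rebuilt during render can never produce a cache hit.

```javascript
// Two object literals with identical contents are still different references,
// so a memo that depends on one of them recomputes on every render.
const a = { page: 1, sort: "asc" };
const b = { page: 1, sort: "asc" };

console.log(Object.is(a, b)); // false: same shape, different references
console.log(Object.is(a, a)); // true: only the same reference hits the cache

// Stable alternative: depend on the primitives inside the object instead.
const depsFromPrimitives = [a.page, a.sort]; // 1 and "asc" compare reliably
```

When I hit this in practice, I either list the primitive fields in the dependency array or hoist the object so its reference is stable.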
When I choose alternatives to useMemo
Sometimes the best optimization isn’t memoization but a different algorithm. If I see a repeated filter or search, I consider pre‑indexing the data once with a Map or Set and using O(1) lookups. If the calculation is too heavy to do on the main thread, I consider moving it to a web worker or precomputing it on the server. Hooks are tools, not excuses to avoid better data structures.
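Here is a small sketch of the pre-indexing idea with made-up product data: one O(n) pass builds a Map up front, and every later lookup is O(1) instead of a repeated linear `find`.

```javascript
// Pre-indexing sketch: build a Map once, then answer lookups in O(1)
// instead of re-running an O(n) Array.prototype.find on every render.
const products = [
  { id: 1, name: "Desk Lamp" },
  { id: 2, name: "Standing Desk" },
  { id: 3, name: "Cable Organizer" }
];

// One O(n) pass up front; in a component this is a natural useMemo candidate.
const byId = new Map(products.map(p => [p.id, p]));

// Each lookup afterwards is a constant-time hash access.
console.log(byId.get(2).name); // "Standing Desk"
console.log(byId.has(99)); // false
```

The same trick works with a Set for membership checks, such as "is this id selected" across thousands of rows.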
useCallback: stable function references for child components
useCallback does for functions what useMemo does for values: it returns the same function reference as long as dependencies don’t change. I treat it like giving a remote control to a child component. If you give it a new remote every render, even if it does the same thing, the child sees it as new and may re‑render.
This matters most when the child is wrapped in React.memo or when it compares props by identity. If you create a new handler inline, the child sees a new prop every time and re‑renders.
Here’s a complete example that shows a memoized child with a stable click handler. The handler can safely read data because I include it in the dependency array.
// index.js
import React, { useCallback, useState } from "react";
import { createRoot } from "react-dom/client";
const PurchaseButton = React.memo(function PurchaseButton({ onPurchase }) {
console.log("PurchaseButton render");
return <button onClick={onPurchase}>Buy now</button>;
});
function Cart() {
const [items, setItems] = useState(["Keyboard", "Mouse"]);
const handlePurchase = useCallback(() => {
// In a real app, send items to an API
alert(`Purchasing ${items.length} items`);
}, [items]);
return (
<div>
<h2>Cart</h2>
<ul>
{items.map(item => (
<li key={item}>{item}</li>
))}
</ul>
<button onClick={() => setItems(prev => [...prev, "Cable"])}>Add cable</button>
<PurchaseButton onPurchase={handlePurchase} />
</div>
);
}
const root = createRoot(document.getElementById("root"));
root.render(<Cart />);
Common mistakes I avoid:
- Empty dependency arrays when the function reads from props or state. That creates stale closures and bugs.
- Overusing useCallback for every handler. If the child isn’t memoized or the handler isn’t passed down, I keep it simple and inline.
I also remind myself that useCallback is a tool for reference stability, not a magic speed boost. It avoids re‑renders caused by function identity changes. If nothing is memoized, it rarely matters.
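The underlying mechanism is plain JavaScript: functions are compared by reference, so an inline handler is a brand-new prop on every render. A small sketch:

```javascript
// Every call creates a new function object, just like an inline handler
// created on every render. Function props are compared by reference.
function makeHandler() {
  return () => "clicked"; // fresh identity each time
}
const first = makeHandler();
const second = makeHandler();
console.log(first === second); // false: a memoized child would see a changed prop

// Reusing one cached reference across "renders" is what useCallback buys you
// while its dependencies stay the same.
const stable = makeHandler();
const handlers = [stable, stable]; // the same reference handed out twice
console.log(handlers[0] === handlers[1]); // true
```

This is also why the comparison inside React.memo fails the moment any prop is rebuilt inline, no matter how identical it looks.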
useCallback in real apps: a pattern that scales
In larger UIs, I often combine useCallback with functional state updates so the callback doesn’t need to depend on the state itself. This keeps the dependency array smaller and reduces handler churn.
function CounterPanel() {
const [count, setCount] = React.useState(0);
// stable because it doesn't depend on count directly
const increment = React.useCallback(() => {
setCount(c => c + 1);
}, []);
return <button onClick={increment}>Count: {count}</button>;
}
This isn’t always possible, but when it is, it’s a clean way to stabilize handlers without losing correctness.
React.memo: preventing re‑renders you don’t need
React.memo wraps a function component and skips re‑rendering when props are the same. It’s a simple way to cut repeated work in leaf components. I think of it like a mailing label: if the address hasn’t changed, you don’t need to re‑route the package.
Here’s an example of a list where each item renders a value. The item component is memoized so it only renders when its value changes.
// index.js
import React from "react";
import { createRoot } from "react-dom/client";
const PriceRow = React.memo(function PriceRow({ label, price }) {
console.log(`Rendering ${label}`);
return (
<li>
{label}: ${price}
</li>
);
});
function PriceList({ prices }) {
return (
<ul>
{prices.map(item => (
<PriceRow key={item.id} label={item.label} price={item.price} />
))}
</ul>
);
}
function App() {
const [tick, setTick] = React.useState(0);
const prices = [
{ id: 1, label: "SSD", price: 129 },
{ id: 2, label: "Monitor", price: 219 },
{ id: 3, label: "Headset", price: 89 }
];
return (
<div>
<button onClick={() => setTick(t => t + 1)}>Ticks: {tick}</button>
<PriceList prices={prices} />
</div>
);
}
const root = createRoot(document.getElementById("root"));
root.render(<App />);
Two important cautions:
- If you pass new object or array literals each render, React.memo won’t help because the prop identity changes. That’s where useMemo and useCallback pair naturally with React.memo.
- React.memo itself has a cost. It compares props. For tiny components, the comparison can cost more than the render it saves.
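To make that cost visible, here is a sketch of the shallow props check React.memo performs by default. This is a simplified illustration, not React's exact source: each own key is compared with Object.is, so nested objects are compared by reference only.

```javascript
// Simplified sketch of a default shallow props comparison (illustrative,
// not React's actual implementation). Keys are compared with Object.is.
function shallowEqual(prev, next) {
  const prevKeys = Object.keys(prev);
  const nextKeys = Object.keys(next);
  if (prevKeys.length !== nextKeys.length) return false;
  return prevKeys.every(
    key =>
      Object.prototype.hasOwnProperty.call(next, key) &&
      Object.is(prev[key], next[key])
  );
}

const style = { color: "red" };
// Same references everywhere: the memoized component would skip its render.
console.log(shallowEqual({ label: "SSD", style }, { label: "SSD", style })); // true
// A nested literal rebuilt each render breaks the check by reference.
console.log(shallowEqual({ style: { color: "red" } }, { style: { color: "red" } })); // false
```

Every render of the parent pays this loop over the props, which is why wrapping trivial components in React.memo can be a net loss.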
When I need extra control, I provide a custom comparison function to React.memo. I reserve this for high‑traffic components like rows in long lists. If I’m comparing many fields, I make sure the comparison is still cheaper than a re‑render.
A safer custom comparison example
Custom comparisons can hide bugs if they ignore a prop that really affects render. I treat them like a sharp tool.
const Row = React.memo(
function Row({ item, isSelected, onSelect }) {
return (
<li onClick={() => onSelect(item.id)}>
{item.label}
</li>
);
},
(prev, next) => {
return (
prev.item.id === next.item.id &&
prev.item.label === next.item.label &&
prev.isSelected === next.isSelected
);
}
);
I only add custom comparisons after I confirm the component is a real hotspot in the profiler. Otherwise, I stick to default shallow props checks.
State design that avoids wasted updates
Many performance problems aren’t about React at all. They’re about how state is shaped. If you store everything in a single object and update one field, you still create a new object that can trigger many downstream renders. I often split state by concern and use local state in leaf components when that keeps the update surface smaller.
There are a few patterns I use repeatedly:
- Split state by update frequency. Fast‑changing UI state (like hover) should not live next to slow‑changing data (like the user profile).
- Prefer derived data over stored data. If you can compute a value from state, compute it, then memoize it if needed. That prevents mismatch bugs.
- Use functional updates to avoid stale values and reduce dependencies.
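The "derive, don't store" pattern is easy to show with made-up cart data: computing the total from the items means there is no separate `total` field that can drift out of sync.

```javascript
// Derived data sketch: compute the total from the source of truth instead
// of storing it alongside and hoping both stay consistent.
const cart = [
  { name: "Keyboard", price: 89 },
  { name: "Mouse", price: 45 }
];

// Recomputed from the items; in React this could sit in a useMemo keyed on
// the items array once the cart grows large enough to matter.
const total = cart.reduce((sum, item) => sum + item.price, 0);
console.log(total); // 134

// Functional updates keep the next state a pure function of the previous
// one, so callbacks don't need the current array in scope:
// setCart(prev => [...prev, newItem]);
```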
Here’s a full example that splits state into UI state and data state. The list component is memoized, so App re-rendering on every keystroke doesn’t automatically re-render the list; it re-renders only when the filtered array it receives actually changes reference.
// index.js
import React, { useMemo, useState } from "react";
import { createRoot } from "react-dom/client";
const ProductList = React.memo(function ProductList({ products }) {
console.log("ProductList render");
return (
<ul>
{products.map(p => (
<li key={p.id}>{p.name}</li>
))}
</ul>
);
});
function App() {
const [search, setSearch] = useState("");
const [products] = useState([
{ id: 1, name: "Desk Lamp" },
{ id: 2, name: "Standing Desk" },
{ id: 3, name: "Cable Organizer" }
]);
const filtered = useMemo(() => {
const term = search.trim().toLowerCase();
if (!term) return products;
return products.filter(p => p.name.toLowerCase().includes(term));
}, [products, search]);
return (
<div>
<input
value={search}
onChange={e => setSearch(e.target.value)}
placeholder="Search products"
/>
<ProductList products={filtered} />
</div>
);
}
const root = createRoot(document.getElementById("root"));
root.render(<App />);
This pattern looks simple, but it prevents a surprising amount of work. The input updates on every keypress, while the memoized list re-renders only when the filtered array reference changes; clearing the search, for example, hands back the original products array and the list skips the render entirely.
State locality: the simplest performance win
If a piece of state affects only one small component, I keep it there. Global state is convenient, but it can cause a render cascade when it changes. A tiny piece of local state can save a lot of unnecessary work.
For example, I’ll keep accordion open/closed state inside the accordion, not at the page level. I’ll keep form input state in the form component rather than in a global store. The smaller the blast radius, the less render cost you pay.
useTransition and useDeferredValue: keep input responsive
Large lists and heavy filtering can block typing even if the computation itself is correct. That’s where concurrent features help. I treat useTransition like an “express lane” for urgent updates. The UI can respond to input immediately while a low‑priority update finishes in the background.
useDeferredValue is similar, but it acts like a soft buffer for derived data. I use it when a derived value should update shortly after the source value, not necessarily on the exact keystroke.
Here’s a simple example that keeps typing smooth while filtering a large list. The list rendering is the deferred part.
// index.js
import React, { useDeferredValue, useMemo, useState } from "react";
import { createRoot } from "react-dom/client";
function App() {
const [query, setQuery] = useState("");
const deferredQuery = useDeferredValue(query);
const items = useMemo(() => {
const base = [];
for (let i = 0; i < 2000; i += 1) {
base.push(`Report ${i + 1}`);
}
return base;
}, []);
const filtered = useMemo(() => {
const q = deferredQuery.trim().toLowerCase();
if (!q) return items;
return items.filter(item => item.toLowerCase().includes(q));
}, [items, deferredQuery]);
const isStale = query !== deferredQuery;
return (
<div>
<input
value={query}
onChange={e => setQuery(e.target.value)}
placeholder="Filter reports"
/>
{isStale && <p>Updating results…</p>}
<ul>
{filtered.map(item => (
<li key={item}>{item}</li>
))}
</ul>
</div>
);
}
const root = createRoot(document.getElementById("root"));
root.render(<App />);
I reach for useTransition when I want a state update to be low priority, like updating a chart after a filter change. I reach for useDeferredValue when I want derived values to lag slightly behind input. Both are about keeping input responsive, which is often the first thing users notice.
useTransition in practice
Here’s a practical pattern I use in dashboards: treat the input as urgent, and the expensive view updates as transition work.
import React, { useMemo, useState, useTransition } from "react";
function FilterableTable({ rows }) {
const [query, setQuery] = useState("");
const [isPending, startTransition] = useTransition();
const [filter, setFilter] = useState("");
const onChange = e => {
const value = e.target.value;
setQuery(value); // urgent
startTransition(() => {
setFilter(value); // low priority
});
};
const filtered = useMemo(() => {
const q = filter.trim().toLowerCase();
if (!q) return rows;
return rows.filter(r => r.name.toLowerCase().includes(q));
}, [rows, filter]);
return (
<div>
<input value={query} onChange={onChange} placeholder="Filter rows" />
{isPending && <p>Filtering…</p>}
<ul>
{filtered.map(row => (
<li key={row.name}>{row.name}</li>
))}
</ul>
</div>
);
}
The small UI message is optional, but it helps users understand why the list lags slightly behind the input when the dataset is huge.
useRef: stable values without re‑renders
Not every value belongs in state. If a value doesn’t affect rendering but you still need to persist it between renders, useRef is a safer and cheaper option. I use it for timers, previous values, and mutable state that shouldn’t trigger a re‑render.
function Stopwatch() {
const [time, setTime] = React.useState(0);
const intervalRef = React.useRef(null);
const start = () => {
if (intervalRef.current) return;
intervalRef.current = setInterval(() => setTime(t => t + 1), 1000);
};
const stop = () => {
clearInterval(intervalRef.current);
intervalRef.current = null;
};
return (
<div>
<p>{time}s</p>
<button onClick={start}>Start</button>
<button onClick={stop}>Stop</button>
</div>
);
}
This avoids a common mistake: storing the interval ID in state, which would trigger a re‑render on every timer start/stop without any UI benefit.
Context performance: keep the blast radius small
Context is great for avoiding prop drilling, but it can cause broad re‑renders when the context value changes. When I see a context used for frequently changing data, I split it into multiple contexts or store only stable data at the top level.
A simple rule I use: if only one part of the app needs a piece of state, don’t put it in context. If multiple parts need it but it updates frequently, consider a dedicated context just for that slice.
Example pattern:
const UserContext = React.createContext(null);
const ThemeContext = React.createContext("light");
function App() {
const user = useUser(); // changes occasionally
const theme = useTheme(); // changes more often
return (
<UserContext.Provider value={user}>
<ThemeContext.Provider value={theme}>
{/* app content goes here */}
</ThemeContext.Provider>
</UserContext.Provider>
);
}
Splitting contexts like this can prevent unrelated components from re‑rendering when only the theme changes.
List rendering: windowing beats micro‑optimizations
When you render thousands of rows, memoization helps but it isn’t enough. The best fix is to render fewer rows. List windowing libraries render only what’s visible and a small buffer around it. In my experience, this can turn a 300ms render into 20–60ms on mid‑range devices.
I use windowing even before reaching for complex memoization. It reduces DOM nodes, memory, and layout work. Hooks still matter, but they become secondary once the list is appropriately virtualized.
Lightweight windowing pattern
If you don’t want a full library, you can implement a basic window with a fixed row height. Here’s a simplified version to illustrate the idea:
function VirtualList({ items, rowHeight, height }) {
const [scrollTop, setScrollTop] = React.useState(0);
const totalHeight = items.length * rowHeight;
const startIndex = Math.floor(scrollTop / rowHeight);
const endIndex = Math.min(
items.length,
startIndex + Math.ceil(height / rowHeight) + 5
);
const visible = items.slice(startIndex, endIndex);
return (
<div
style={{ height, overflow: "auto" }}
onScroll={e => setScrollTop(e.currentTarget.scrollTop)}
>
<div style={{ height: totalHeight, position: "relative" }}>
{visible.map((item, i) => {
const top = (startIndex + i) * rowHeight;
return (
<div
key={startIndex + i}
style={{ position: "absolute", top, height: rowHeight, width: "100%" }}
>
{item.label}
</div>
);
})}
</div>
</div>
);
}