React 19: The End of 'useMemo'? Everything You Need to Know About the New Compiler


React 19 ships the React Compiler as an opt-in production feature, and that changes how developers approach performance optimization. The React Compiler, an evolution of the internal Meta project once called React Forget, performs automatic memoization at build time. useMemo, useCallback, and React.memo, three of the most debated and frequently misused APIs in the React ecosystem, are no longer daily necessities for most applications. The compiler analyzes component code statically and inserts cache boundaries per reactive scope rather than per hook call, producing more granular memoization than most developers write by hand (inspect compiled output to compare slot count against your manual hook count). The React team recommends incremental adoption, and for advanced React developers, working effectively with the compiler requires understanding its static analysis model, its bail-out conditions, and migration mechanics.
Table of Contents
- How React Re-rendering Works Today (and Why It's a Problem)
- What Is the React Compiler?
- Automatic Memoization in Practice
- Setting Up the React Compiler in Your Project
- What the Compiler Cannot Do (and When You Still Need Manual Optimization)
- Migration Strategy for Existing Codebases
- Real-World Performance Impact
- What This Means for the Future of React Development
How React Re-rendering Works Today (and Why It's a Problem)
React's reconciliation model has a well-known cascade problem. When a parent component's state changes, React re-renders that component and every child in its subtree by default, regardless of whether the child's props actually changed. This is by design: React prioritizes correctness and predictability over performance. But in complex applications with deeply nested trees, large lists, or frequent state updates, the cost adds up fast (think >16ms per frame, visible as jank in DevTools Performance traces).
The escape hatches—useMemo for computed values, useCallback for stable function references, and React.memo for component-level shallow comparison—exist specifically to break this cascade. They let developers tell React "skip this work if the inputs haven't changed." The problem is threefold: they add significant verbosity, they require developers to manually specify dependency arrays (which are easy to get wrong), and misapplying them can actually hurt performance by adding overhead without benefit.
Consider a straightforward component tree where a parent holds a counter and renders a child that iterates a large dataset:
import { useState } from 'react';

// ⚠️ DO NOT COPY — intentionally unoptimized to demonstrate the problem
// No memoization — child re-renders on every parent state change
function Parent() {
  const [count, setCount] = useState(0);
  const [text, setText] = useState('');

  // Called on every render, even when only `count` changed
  const items = generateExpensiveList(text);

  const handleClick = () => setCount(c => c + 1);

  return (
    <div>
      <button onClick={handleClick}>Count: {count}</button>
      <ExpensiveChild items={items} onAction={() => console.log('action')} />
    </div>
  );
}

function ExpensiveChild({ items, onAction }) {
  return (
    <ul>
      {items.map(item => (
        <li key={item.id} onClick={onAction}>{item.name}</li>
      ))}
    </ul>
  );
}
Every time count changes, ExpensiveChild re-renders. The items array is recomputed, and the inline onAction arrow function creates a new reference each render. Now look at the manually optimized version:
import { useState, useMemo, useCallback, memo } from 'react';

// Manual memoization — verbose, fragile dependency arrays
function Parent() {
  const [count, setCount] = useState(0);
  const [text, setText] = useState('');

  const items = useMemo(() => generateExpensiveList(text), [text]);
  const handleClick = useCallback(() => setCount(c => c + 1), []);
  const handleAction = useCallback(() => console.log('action'), []);

  return (
    <div>
      <button onClick={handleClick}>Count: {count}</button>
      <MemoizedChild items={items} onAction={handleAction} />
    </div>
  );
}

const MemoizedChild = memo(function ExpensiveChild({ items, onAction }) {
  return (
    <ul>
      {items.map(item => (
        <li key={item.id} onClick={onAction}>{item.name}</li>
      ))}
    </ul>
  );
});
Three hooks, a wrapper HOC, and three dependency arrays, all to express something the developer already intuitively knew: "don't redo this work if the inputs haven't changed." The React Compiler aims to make this entire layer unnecessary.
What Is the React Compiler?
From React Forget to a Production Compiler
The project began as React Forget, an internal Meta initiative to auto-memoize React components. The React team developed it over several years, then shipped it experimentally as the babel-plugin-react-compiler package. It reached opt-in production status with React 19, and the team recommends incremental adoption. Unlike runtime optimizations such as concurrent features or Suspense, the compiler's transformation operates entirely at build time. It is an ahead-of-time (AOT) transform: the React team implemented it as a Babel plugin and integrated it into bundler pipelines for Vite, Next.js (via both Webpack and Turbopack; verify Turbopack support status separately), and Webpack.
The distinction matters. Runtime optimizations add overhead during execution to skip unnecessary work. The compiler, by contrast, rewrites your component code before it ever reaches the browser, inserting caching logic that adds only lightweight per-slot equality checks at runtime, with no scheduling or virtual DOM diffing overhead.
How the Compiler Analyzes Your Code
The compiler performs static analysis on every component and hook body. It parses the code to identify values, their dependencies, and what it calls "reactive scopes"—blocks of code whose outputs depend on specific reactive inputs (props, state, context values). Within each reactive scope, the compiler tracks which values flow into which computations and which computations flow into JSX output.
This analysis depends on components following the Rules of React: renders must be idempotent, side effects must not occur during render, and props and state must be treated as immutable. The compiler enforces these rules statically. If it encounters code that violates them (mutating a prop, for example, or calling a non-deterministic function during render), it bails out of optimization for that specific scope rather than producing incorrect code.
What the Compiler Outputs
The compiled output replaces reactive scopes with cached slots and conditional equality checks. The compiler assigns each memoization-worthy value a slot in a per-component cache array. On subsequent renders, the compiled code checks whether inputs to that slot have changed; if not, it returns the cached value.
The _useMemoCache function shown below is an internal React runtime API. It initializes every cache slot to a special sentinel value (Symbol.for("react.memo_cache_sentinel")) that is guaranteed to never be === to any valid prop, state, or computed value, ensuring the first render always takes the "miss" path. The function is shown here for explanatory purposes only and should never be called directly in application code.
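The mechanics are easy to picture with a toy model. The sketch below is illustrative only: `makeMemoCache` and `render` are hypothetical names, a plain module-level array stands in for the per-component cache, and React's real runtime ties the cache to the rendering component's fiber rather than anything like this.

```javascript
// Toy model of the cache-slot idea — NOT React's actual implementation.
const SENTINEL = Symbol.for('react.memo_cache_sentinel');

function makeMemoCache(size) {
  // Every slot starts as the sentinel, so the first "render" always misses.
  return new Array(size).fill(SENTINEL);
}

// Simulates the compiled pattern: recompute only when the tracked input changed.
function render(cache, filter, compute) {
  if (cache[0] !== filter) { // sentinel !== anything, so the first call misses
    cache[0] = filter;       // slot 0: tracked input
    cache[1] = compute(filter); // slot 1: cached result
  }
  return cache[1];
}

const $ = makeMemoCache(2);
const a = render($, 'ab', f => ({ value: f.toUpperCase() }));
const b = render($, 'ab', f => ({ value: f.toUpperCase() })); // cache hit
console.log(a === b); // true — same object reference on the second call
```

The sentinel matters because `undefined` or `null` could be legitimate cached values; a unique symbol can never collide with application data.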
Here is the core transformation. Given this clean source component:
// Source — no memoization hooks, users passed as prop for compiler tracking
function UserList({ users, filter, onSelect }) {
  const filtered = users.filter(u => u.name.includes(filter));
  const sortedUsers = [...filtered].sort((a, b) =>
    a.name.localeCompare(b.name)
  );
  return (
    <ul>
      {sortedUsers.map(user => (
        <li key={user.id} onClick={() => onSelect(user)}>{user.name}</li>
      ))}
    </ul>
  );
}
The compiler produces something conceptually equivalent to:
// Simplified compiled output — auto-generated cache slots
function UserList({ users, filter, onSelect }) {
  const $ = _useMemoCache(6); // 6 cache slots allocated (indices 0–5)
  let sortedUsers;
  if ($[0] !== users || $[1] !== filter) { // Slots 0–1: track both data inputs
    const filtered = users.filter(u => u.name.includes(filter));
    sortedUsers = [...filtered].sort((a, b) => a.name.localeCompare(b.name));
    $[0] = users;
    $[1] = filter;
    $[2] = sortedUsers; // Slot 2: caches computed list
  } else {
    sortedUsers = $[2];
  }
  let jsx;
  if ($[3] !== sortedUsers || $[4] !== onSelect) { // Slots 3–4: track JSX inputs
    jsx = (
      <ul>
        {sortedUsers.map(user => (
          <li key={user.id} onClick={() => onSelect(user)}>{user.name}</li>
        ))}
      </ul>
    );
    $[3] = sortedUsers;
    $[4] = onSelect;
    $[5] = jsx; // Slot 5: cache JSX on miss only
  } else {
    jsx = $[5]; // Slot 5: return cached JSX on hit
  }
  return jsx;
}
Notice the granularity. The compiler creates separate cache boundaries for the data computation and the JSX output. It tracks users, filter, and onSelect independently. A manual useMemo call would typically memoize either the whole computation or the whole component, but rarely at this level of precision.
Automatic Memoization in Practice
What Gets Memoized Automatically
The compiler memoizes:
- component return values (JSX elements)
- inline objects and arrays passed as props
- callback functions passed as props
- computed values inside component bodies
- hook return values
Anything that creates a new reference on each render and feeds into a child component or a subsequent computation is a candidate for automatic caching.
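A quick way to see why these values need caching at all: every render creates fresh references, so any shallow (referential) comparison always reports a change. A minimal plain-JavaScript illustration, where `renderOnce` is a hypothetical stand-in for a component's render pass:

```javascript
// Each "render" creates brand-new references for inline values,
// even though the contents are structurally identical.
const renderOnce = () => ({
  style: { color: 'red' },         // new object every call
  items: [1, 2, 3],                // new array every call
  onAction: () => console.log('action'), // new function every call
});

const first = renderOnce();
const second = renderOnce();

// Structurally identical, referentially distinct:
console.log(first.style === second.style);       // false
console.log(first.items === second.items);       // false
console.log(first.onAction === second.onAction); // false
```

This is exactly the failure mode that defeats React.memo's shallow prop comparison; the compiler's cache slots preserve the previous reference whenever the tracked inputs are unchanged.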
Replacing useMemo and useCallback
Consider a data-heavy component that filters and sorts a product list:
import { useMemo } from 'react';

// Before the compiler — manual memoization
function ProductTable({ products, searchTerm, sortKey }) {
  const filtered = useMemo(
    () => products.filter(p => p.name.toLowerCase().includes(searchTerm.toLowerCase())),
    [products, searchTerm]
  );
  const sorted = useMemo(
    () => [...filtered].sort((a, b) => {
      if (a[sortKey] < b[sortKey]) return -1;
      if (a[sortKey] > b[sortKey]) return 1;
      return 0;
    }),
    [filtered, sortKey]
  );
  return <Table data={sorted} />;
}
With the React Compiler, the same component needs no hooks:
// With the React Compiler — compiler handles memoization
function ProductTable({ products, searchTerm, sortKey }) {
  const filtered = products.filter(p =>
    p.name.toLowerCase().includes(searchTerm.toLowerCase())
  );
  const sorted = [...filtered].sort((a, b) => {
    if (a[sortKey] < b[sortKey]) return -1;
    if (a[sortKey] > b[sortKey]) return 1;
    return 0;
  });
  return <Table data={sorted} />;
}
The compiler analyzes that filtered depends on products and searchTerm, and sorted depends on filtered and sortKey, then generates cache slots for each. The code is cleaner, and the memoization boundaries are more granular: the compiler generates one cache slot per dependency-tracked expression, which typically exceeds the granularity of a single useMemo wrapping an entire block.
For callbacks, the pattern is similar. A parent passing handlers to a list of children no longer needs useCallback wrapping:
// With the React Compiler — no useCallback, no React.memo needed on child
// The compiler memoizes the inline arrow per task when task identity is stable.
// Verify via React DevTools "Memo ✨" badge on each TaskItem.
function TaskList({ tasks, onComplete }) {
  return (
    <ul>
      {tasks.map(task => (
        <TaskItem
          key={task.id}
          task={task}
          onComplete={() => onComplete(task.id)}
        />
      ))}
    </ul>
  );
}

function TaskItem({ task, onComplete }) {
  return <li onClick={onComplete}>{task.title}</li>;
}
The compiler memoizes each TaskItem's props, including the inline arrow function, and wraps the child rendering in conditional checks. React.memo on TaskItem becomes redundant. Note that the compiler's per-iteration memoization of the inline arrow relies on task object identity being stable across renders for each list item. If the tasks array is reconstructed on every render (e.g., from an unmemoized selector), all per-task callbacks will be invalidated regardless of compiler optimization.
Replacing React.memo
The compiler makes React.memo wrapper components unnecessary for the standard case of shallow prop comparison. One nuance worth flagging: React.memo with a custom comparator function (the second argument) implements logic that goes beyond referential equality. The compiler does not replicate custom comparison logic. If a component relies on a custom comparator that, say, deep-compares specific nested props, that React.memo wrapper should remain in place. Similarly, React.memo used with library-provided comparators (e.g., shallowEqual from react-redux) must be retained.
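To make the distinction concrete, here is a plain-JavaScript sketch contrasting the referential check the compiler relies on with a custom deep-field comparator of the kind passed as React.memo's second argument. Both `shallowEqual` and `compareByUserFields` are local illustrations written for this example, not library imports:

```javascript
// Referential (shallow) comparison — the only kind of check the
// compiler's cache slots perform on each tracked input.
function shallowEqual(prevProps, nextProps) {
  const keys = Object.keys(prevProps);
  if (keys.length !== Object.keys(nextProps).length) return false;
  return keys.every(k => Object.is(prevProps[k], nextProps[k]));
}

// A custom comparator that deep-compares specific nested fields —
// logic the compiler will not generate for you.
function compareByUserFields(prevProps, nextProps) {
  return (
    prevProps.user.id === nextProps.user.id &&
    prevProps.user.name === nextProps.user.name
  );
}

const prev = { user: { id: 1, name: 'Ada' } };
const next = { user: { id: 1, name: 'Ada' } }; // new object, same contents

console.log(shallowEqual(prev, next));        // false — different references
console.log(compareByUserFields(prev, next)); // true — would skip the re-render
```

A component whose skip decision depends on the second comparator's answer rather than the first's is exactly the case where the React.memo wrapper must stay.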
Setting Up the React Compiler in Your Project
Prerequisites and Compatibility
The compiler requires React 19. If you're on Node 16, upgrade first—Node.js 18 or later is required. Build tool support covers Vite (via @vitejs/plugin-react), Next.js 15+ (experimental opt-in via experimental.reactCompiler: true; not enabled by default), and Webpack with Babel (via babel-plugin-react-compiler). TypeScript is fully supported. An ESLint plugin, eslint-plugin-react-compiler, validates that component code follows the Rules of React and flags patterns the compiler cannot optimize.
Installation and Configuration
For a Vite project:
npm install react@19 react-dom@19
npm install -D babel-plugin-react-compiler eslint-plugin-react-compiler
With React 19 as the target, the compiler's cache runtime ships inside react itself. The separate react-compiler-runtime package is a backport needed only when targeting React 17 or 18 via the plugin's target option; verify this against the current babel-plugin-react-compiler README. Before installing any of these packages, verify the names and publishers on npmjs.com, confirm the publisher belongs to the React/Meta organization, and pin exact versions to avoid installing an incorrect or squatted package.
babel-plugin-react-compiler is a build-time tool and belongs in devDependencies for most SPA setups. However, if your deployment pipeline runs npm install --production before the build step (common in some SSR or edge-function configurations), you may need to move it to dependencies to ensure it is available during bundling.
// vite.config.ts
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [
    react({
      babel: {
        plugins: [
          ['babel-plugin-react-compiler', { target: '19' }],
        ],
      },
    }),
  ],
});
For Next.js:
// next.config.js
// Note: Next.js 14+ also supports next.config.mjs (ESM format).
// CJS is shown here for broad compatibility.
module.exports = {
  experimental: {
    reactCompiler: true,
  },
};
ESLint setup (legacy .eslintrc.js format for ESLint 8 and earlier):
// .eslintrc.js (ESLint 8 and earlier — legacy format)
module.exports = {
  plugins: ['react-compiler'],
  rules: {
    'react-compiler/react-compiler': 'error',
  },
};
ESLint 9+ uses the flat config format. If your project uses ESLint 9+, use the following instead:
// eslint.config.js (ESLint 9+ flat config)
import reactCompiler from 'eslint-plugin-react-compiler';

export default [
  {
    plugins: { 'react-compiler': reactCompiler },
    rules: { 'react-compiler/react-compiler': 'error' },
  },
];
For incremental adoption, the compiler supports annotation mode via the compilationMode: 'annotation' config option; individual files can be opted in using a directive (verify the current directive syntax—e.g., 'use memo'—in the babel-plugin-react-compiler README, as this has changed across releases; the earlier 'use forget' directive may no longer be recognized). An opt-out mode is also available where specific files can be excluded.
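As a sketch of what annotation mode looks like in practice; the option name and directive string below are taken from the plugin's documented configuration and should be verified against the README for your installed version:

```javascript
// babel.config.js — annotation-mode sketch; verify option names against
// the babel-plugin-react-compiler README for your installed version.
module.exports = {
  plugins: [
    ['babel-plugin-react-compiler', {
      target: '19',
      compilationMode: 'annotation', // compile only functions that opt in
    }],
  ],
};

// In a component file, opt in with a directive at the top of the function.
// The directive string has changed across releases ('use forget' in early
// builds, 'use memo' later) — confirm the current one before relying on it.
function OptedInComponent() {
  'use memo';
  // ...this function is compiled; siblings without the directive are left alone
  return null;
}
```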
Validating the Compiler Is Working
React DevTools displays a "Memo ✨" badge on components that the compiler has auto-memoized (verify exact badge appearance against your installed React DevTools version). The compiler also emits diagnostics during the build: warnings when it skips a component, and info-level messages about the cache slots it creates. Inspecting the compiled output directly (via source maps or build output) confirms the transformation is applied.
What the Compiler Cannot Do (and When You Still Need Manual Optimization)
Code Patterns the Compiler Skips
The compiler bails out on components or hooks that violate the Rules of React. Mutating objects during render, reading from mutable external stores without useSyncExternalStore, non-idempotent render logic (e.g., Math.random() inline in JSX), and dynamic property access patterns that prevent static dependency tracking all cause the compiler to skip optimization for the affected scope.
Without eslint-plugin-react-compiler or build diagnostics enabled, bail-outs are silent; there is no runtime warning. This makes the ESLint plugin the primary tool for catching unoptimized components during development.
Here is an example of a pattern that triggers a bail-out:
// Compiler skips this — mutation during render
function BadComponent({ data }) {
const result = data;
result.processed = true; // ❌ Mutating props during render
return <Display data={result} />;
}
// Compiler warning: "Mutating component props or state during render
// is not compatible with React Compiler optimizations."
// Fix: create a new object (null-safe)
function FixedComponent({ data }) {
const result = { ...(data ?? {}), processed: true };
return <Display data={result} />;
}
Third-party libraries with opaque internal patterns or non-standard hook implementations can also confuse the analyzer.
When useMemo Still Has a Role
There are edge cases where keeping useMemo makes sense. Custom comparator logic that goes beyond referential equality is one. Another is when useMemo serves as a semantic signal to teammates that a computation is intentionally costly (e.g., runs an O(n log n) sort over thousands of items) and should not be casually refactored. For computations where developers want an explicit guarantee visible in the source code, keeping useMemo is a reasonable choice, though functionally redundant when the compiler is active.
Migration Strategy for Existing Codebases
Incremental Adoption Path
Start by adding eslint-plugin-react-compiler and running it across the codebase. This surfaces any rule violations that would cause the compiler to bail out. Fix those violations first. Then enable the compiler in opt-in mode, targeting a few non-critical files. Measure performance using the React DevTools Profiler and expand gradually.
Should You Remove Existing useMemo and useCallback Calls?
When the compiler encounters existing useMemo or useCallback calls, it respects them. The practical effect is that both the manual hook and the compiler's auto-generated caching may apply, double-memoizing the same value. Double-memoizing doesn't break semantics but adds marginal per-render overhead; prioritize cleanup in hot-path components. The recommended approach: leave existing hooks in place during the initial migration to avoid introducing regressions, then remove them in a dedicated cleanup pass. Codemod tooling can automate the removal of useMemo, useCallback, and React.memo wrappers that the compiler now handles.
Performance Benchmarking Before and After
Measure re-render counts via React DevTools Profiler, interaction-to-next-paint (INP) via Chrome's Performance panel, and bundle size impact. The compiler adds per-component overhead from cache arrays—measure the delta with source-map-explorer or a before/after bundle diff—offset by removing manual hook imports. Realistic expectations: the biggest gains appear in applications with lists exceeding ~500 items, deeply nested component trees (10+ levels), and frequent state updates. Applications that were already aggressively hand-optimized may see modest differences.
Real-World Performance Impact
What Meta's Internal Data Shows
Meta has deployed the React Compiler across Instagram and Facebook surfaces in production. The React team has reported improvements in INP and component render times. Consult the React blog for specific case studies and published metrics; for your own application, compare INP before and after using Chrome's Performance panel. In several cases, the compiler's granular, per-scope memoization outperformed hand-tuned optimization because it inserted cache boundaries at points developers would not have bothered with manually.
Community Benchmarks and Early Adopter Reports
Reports from early adopters and GitHub discussions point to the largest gains in scenarios involving lists exceeding ~500 items or trees deeper than ~10 levels; benchmark your own application to confirm. Components with frequent state updates driving cascading re-renders also benefit significantly. Applications with relatively flat component structures and infrequent updates see less impact, as expected.
What This Means for the Future of React Development
The React Compiler shifts the framework from a "developer-optimized" model, where performance depends on individual engineers making correct memoization decisions, to a "compiler-optimized" model where the toolchain handles it. This trajectory mirrors what Svelte (which compiles templates ahead of time) has pursued and, to a lesser extent, Solid (which uses fine-grained runtime reactivity rather than a compiler): treating the framework as a compiled language rather than a pure runtime library.
For developers reading this now: install eslint-plugin-react-compiler today and audit rule violations. Experiment with the compiler on a non-critical path. Begin planning a phased migration. useMemo is not dead. The compiler respects it, and edge cases remain where it's appropriate. But its role as a daily necessity in React development is ending.