Learn Web Workers in JavaScript for running code in background threads. Understand postMessage, Dedicated and Shared Workers, and transferable objects.
Ever clicked a button and watched your entire page freeze? Tried to scroll while a script was running and nothing happened?
```javascript
// This will freeze your entire page for ~5 seconds
function heavyCalculation() {
  const start = Date.now()
  while (Date.now() - start < 5000) {
    // Simulating heavy work
  }
  return 'Done!'
}

document.getElementById('btn').addEventListener('click', () => {
  console.log('Starting...')
  const result = heavyCalculation() // Page freezes here
  console.log(result)
})

// During those 5 seconds:
// - Can't click anything
// - Can't scroll
// - Animations stop
// - The page looks broken
```
That’s JavaScript’s single thread at work. But there’s a way out: Web Workers. Defined in the WHATWG HTML Living Standard, they let you run JavaScript in background threads, keeping your UI smooth while crunching numbers, parsing data, or processing images. According to Can I Use data, Web Workers have over 98% browser support across all modern browsers.
In this guide, you’ll learn:

- How Web Workers provide true parallelism (not just concurrency)
- Creating workers and communicating with postMessage
- The difference between Dedicated, Shared, and Service Workers
- Transferable objects for moving large data without copying
- OffscreenCanvas for graphics processing in workers
- Real-world patterns: worker pools, inline workers, heavy computations
Prerequisites: This guide builds on the Event Loop and async/await. Understanding those concepts will help you see why Web Workers solve problems that async code can’t.
You might think: “I already know async JavaScript. Doesn’t that solve the freezing problem?” Not quite. Here’s the thing everyone gets wrong about async: async JavaScript is still single-threaded. It’s concurrent, not parallel. As explained in the ECMAScript specification, the language runtime uses a single execution thread — async operations yield control but never run JavaScript code simultaneously on the main thread.
```javascript
// Async code is NOT running at the same time
async function fetchData() {
  console.log('1: Starting fetch')
  const response = await fetch('/api/data') // Waits, but doesn't block
  console.log('3: Got response')
  return response.json()
}

console.log('0: Before fetch')
fetchData()
console.log('2: After fetch call')

// Output:
// 0: Before fetch
// 1: Starting fetch
// 2: After fetch call
// 3: Got response (later)
```
The await lets other code run while waiting for the network. But here’s the catch: the actual JavaScript execution is still one thing at a time.
Async works great for I/O operations (network requests, file reads) because you’re waiting for something external. But what about CPU-bound tasks?
```javascript
// This async function STILL freezes the page
async function processLargeArray(data) {
  const results = []

  // This loop is synchronous JavaScript
  // The "async" keyword doesn't help here!
  for (let i = 0; i < data.length; i++) {
    results.push(expensiveCalculation(data[i]))
  }

  return results
}

// The page freezes during the loop
// async/await only helps with WAITING, not COMPUTING
```
```
ASYNC (concurrency)                     PARALLEL (Web Workers)
───────────────────                     ──────────────────────

Main Thread                             Main Thread       Worker Thread
┌─────────────────┐                     ┌───────────┐     ┌───────────┐
│ Task A (work)   │                     │  Task A   │     │  Task B   │
│ Wait for I/O... │ ← yields            │  (work)   │     │  (work)   │
│ Task B (work)   │                     │           │     │           │
│ Task A resumed  │                     └───────────┘     └───────────┘
└─────────────────┘
                                        Both run at the SAME TIME
One thread, tasks take turns            on different CPU cores

GOOD FOR: network requests,             GOOD FOR: heavy calculations,
file reads, timers                      image processing, data parsing
```
The Rule: Use async/await when you’re waiting for something. Use Web Workers when you’re computing something heavy.
If you’ve read our Event Loop guide, you know JavaScript is like a restaurant with a single chef. The chef can only cook one dish at a time, but clever scheduling (the event loop) keeps things moving.

Web Workers are like hiring more chefs.
```
MAIN KITCHEN (Main Thread)            PREP KITCHEN (Worker Thread)
┌──────────────────────────┐          ┌──────────────────────────┐
│ HEAD CHEF                │          │ PREP CHEF                │
│                          │          │                          │
│ • Takes customer orders  │          │ • Chops vegetables       │
│   (events)               │          │ • Preps ingredients      │
│ • Plates dishes (UI)     │          │ • Heavy work             │
│ • Talks to customers     │          │ • No customer contact    │
│   (DOM access)           │          │   (no DOM!)              │
└────────────┬─────────────┘          └────────────┬─────────────┘
             │       ┌──────────────────┐          │
             │       │  SERVICE WINDOW  │          │
             └──────►│   (postMessage)  │◄─────────┘
                     │                  │
                     │ "Need 50 onions  │
                     │  chopped!"       │
                     │ "Here they are!" │
                     └──────────────────┘

KEY RULES:
• Chefs can't share cutting boards (no shared memory by default)
• They communicate through the service window (postMessage)
• Prep chef can't talk to customers (workers can't touch the DOM)
• Prep chef has their own tools (workers have their own global scope)
```
| Kitchen | JavaScript |
| --- | --- |
| Head Chef | Main thread (handles UI, events, DOM) |
| Prep Chef | Web Worker (handles heavy computation) |
| Service Window | `postMessage()` / `onmessage` (communication) |
| Cutting Board | Memory (each chef has their own) |
| Customers | Users interacting with the page |
| Kitchen Rules | Worker limitations (no DOM access) |
The prep chef works independently in their own kitchen. They can’t talk to customers (no DOM access), but they can do heavy prep work without slowing down the head chef. When they’re done, they pass the result through the service window.
A Web Worker is a JavaScript script that runs in a background thread, separate from the main thread. It has its own global scope, its own event loop, and executes truly in parallel with your main code. Workers communicate with the main thread through message passing using postMessage() and onmessage. This lets you run expensive computations without freezing the UI.

Here’s a basic example:
```javascript
// main.js - runs on the main thread
const worker = new Worker('worker.js')

// Send data to the worker
worker.postMessage({ numbers: [1, 2, 3, 4, 5] })

// Receive results from the worker
worker.onmessage = (event) => {
  console.log('Result from worker:', event.data)
}
```
```javascript
// worker.js - runs in a separate thread
self.onmessage = (event) => {
  const { numbers } = event.data

  // Do heavy computation (won't freeze the UI!)
  const sum = numbers.reduce((a, b) => a + b, 0)

  // Send result back to main thread
  self.postMessage({ sum })
}
```
Inside a worker, self refers to the worker’s global scope (a DedicatedWorkerGlobalScope). You can also use this at the top level, but self is clearer.
Workers and the main thread communicate through messages. They can’t directly access each other’s variables. This is intentional: it prevents the race conditions and bugs that plague traditional multi-threaded programming.
```javascript
// worker.js (module style)
import { processData } from './utils.js' // Standard ES modules!

self.onmessage = (event) => {
  const { task, data } = event.data
  if (task === 'process') {
    const result = processData(data)
    self.postMessage(result)
  }
}
```
Use module workers whenever possible. They support import/export, have strict mode by default, and work better with modern tooling. Check browser support before using in production.
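On the main-thread side, the only change is the `type` option passed to the constructor. A small helper sketch (the `createModuleWorker` name is just for illustration, not a standard API):

```javascript
// Hypothetical helper: spawn a worker that can use import/export.
// The { type: 'module' } option is what enables ES module syntax in the worker.
function createModuleWorker(url) {
  return new Worker(url, { type: 'module' })
}

// Usage: const worker = createModuleWorker('worker.js')
```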
When you send data via postMessage, it’s copied using the structured clone algorithm. This is deeper than JSON.stringify: it handles more types, preserves object references within the data, and even supports circular references.
You can use the global structuredClone() function to deep-clone objects using the same algorithm. This is useful for copying complex data outside of worker communication:
```javascript
const original = {
  name: 'Alice',
  date: new Date(),
  nested: { deep: true }
}

// Deep clone with structuredClone (handles Date, Map, Set, etc.)
const clone = structuredClone(original)
clone.name = 'Bob'

console.log(original.name)              // 'Alice' (unchanged)
console.log(clone.date instanceof Date) // true (Date preserved!)
```
```javascript
// main.js
const data = {
  name: 'Alice',
  scores: [95, 87, 92],
  metadata: {
    date: new Date(),
    pattern: /test/gi
  }
}

worker.postMessage(data)
// The worker receives a COPY of this object
// Modifying it in the worker won't affect the original
```
Error cloning: Only standard error types can be cloned (Error, EvalError, RangeError, ReferenceError, SyntaxError, TypeError, URIError). The name and message properties are preserved, and browsers may also preserve stack and cause.
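You can observe this with `structuredClone`, which applies the same algorithm (a quick sketch; runs in modern browsers and Node 17+):

```javascript
// Cloned errors keep their subtype and message
const originalError = new RangeError('value out of range')
const clonedError = structuredClone(originalError)

console.log(clonedError instanceof RangeError) // true (subtype preserved)
console.log(clonedError.message)               // 'value out of range'
console.log(clonedError === originalError)     // false (a distinct copy)
```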
```javascript
// ✓ These work
worker.postMessage({
  text: 'hello',
  numbers: [1, 2, 3],
  date: new Date(),
  regex: /pattern/g,
  binary: new Uint8Array([1, 2, 3]),
  map: new Map([['a', 1], ['b', 2]])
})

// ❌ These will throw errors
worker.postMessage({
  fn: () => console.log('hi'), // Functions can't be cloned
  element: document.body,      // DOM nodes can't be cloned
  sym: Symbol('test')          // Symbols can't be cloned
})
```
Performance trap: Structured cloning can be slow for large objects. If you’re passing megabytes of data, consider using Transferable objects instead (covered below).
Copying large amounts of data between threads is slow. For big ArrayBuffers, images, or binary data, use transferable objects to move data instead of copying it.
```javascript
// main.js
// Creating a 100MB buffer
const hugeBuffer = new ArrayBuffer(100 * 1024 * 1024)
const array = new Uint8Array(hugeBuffer)

// Fill it with data
for (let i = 0; i < array.length; i++) {
  array[i] = i % 256
}

console.time('copy')
worker.postMessage(hugeBuffer) // This COPIES 100MB - slow!
console.timeEnd('copy')        // Could take hundreds of milliseconds
```
Instead of copying, you can transfer the buffer to the worker. The transfer is nearly instant, but the original becomes unusable:
```javascript
// main.js
const hugeBuffer = new ArrayBuffer(100 * 1024 * 1024)
const array = new Uint8Array(hugeBuffer)
// Fill with data...

console.time('transfer')
// Second argument is an array of objects to transfer
worker.postMessage(hugeBuffer, [hugeBuffer])
console.timeEnd('transfer') // Nearly instant!

// WARNING: hugeBuffer is now "detached" (unusable)
console.log(hugeBuffer.byteLength) // 0
console.log(array.length)          // 0
```
```javascript
// worker.js
self.onmessage = (event) => {
  const buffer = event.data
  console.log(buffer.byteLength) // 104857600 (100MB)

  // Process the data...
  const array = new Uint8Array(buffer)

  // Transfer it back when done
  self.postMessage(buffer, [buffer])
}
```
The most commonly used transferable objects are ArrayBuffer, MessagePort, ImageBitmap, and OffscreenCanvas. For a complete list including newer APIs like MediaStreamTrack and WebTransportSendStream, see MDN’s Transferable objects documentation.
```
COPY (default)                        TRANSFER
──────────────                        ────────

Main Thread      Worker Thread        Main Thread      Worker Thread
┌─────────┐      ┌─────────┐          ┌─────────┐      ┌─────────┐
│ [data]  │      │         │          │ [data]  │      │         │
│ 100MB   │      │         │          │ 100MB   │      │         │
└────┬────┘      └─────────┘          └────┬────┘      └─────────┘
     │ copy                                │ move
     ▼                                     ▼
┌─────────┐      ┌─────────┐          ┌─────────┐      ┌─────────┐
│ [data]  │      │ [data]  │          │ [empty] │      │ [data]  │
│ 100MB   │      │ 100MB   │          │ 0MB     │      │ 100MB   │
└─────────┘      └─────────┘          └─────────┘      └─────────┘

• Slow (copies bytes)                 • Fast (moves pointer)
• Both have the data                  • Only one has the data
• Memory doubled                      • Memory unchanged
```
Rule of thumb: Transfer when the data is large (> 1MB) and you don’t need to keep it in the sending context. Copy when the data is small or you need it in both places.
Shared Workers can be accessed by multiple scripts, even across different browser tabs or iframes (as long as they’re from the same origin).
```javascript
// main.js (Tab 1)
const worker = new SharedWorker('shared-worker.js')

worker.port.onmessage = (event) => {
  console.log('Received:', event.data)
}

worker.port.postMessage('Hello from Tab 1')
```
```javascript
// main.js (Tab 2) - connects to the SAME worker
const worker = new SharedWorker('shared-worker.js')

worker.port.onmessage = (event) => {
  console.log('Received:', event.data)
}

worker.port.postMessage('Hello from Tab 2')
```
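The worker side isn’t shown above. Here’s a sketch of what shared-worker.js might contain: a simple hub that relays each incoming message to every connected tab (the broadcast behavior is an assumption for illustration, not part of the original example):

```javascript
// shared-worker.js: one instance serves every connecting tab.
// onconnect fires once per connection, with a dedicated MessagePort.
const connections = []

// Inside a shared worker, `self` is the SharedWorkerGlobalScope;
// the globalThis fallback only keeps this sketch runnable outside a worker.
const scope = typeof self !== 'undefined' ? self : globalThis

scope.onconnect = (event) => {
  const port = event.ports[0]
  connections.push(port)

  port.onmessage = (e) => {
    // Relay each message to every connected tab (simple broadcast)
    for (const p of connections) {
      p.postMessage(e.data)
    }
  }
}
```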
Shared Workers have limited browser support. They work in Chrome, Firefox, Edge, and Safari 16+, but are not supported on Android browsers (Chrome for Android, Samsung Internet). Check caniuse.com before using in production.
Service Workers are a special type of worker designed for a different purpose: they act as a proxy between your web app and the network. They enable offline functionality, push notifications, and background sync.
```javascript
// Registering a service worker (in main.js)
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then(registration => {
      console.log('SW registered:', registration)
    })
    .catch(error => {
      console.log('SW registration failed:', error)
    })
}
```
Service Workers are a deep topic with their own complexities around lifecycle, caching strategies, and updates. They deserve their own dedicated guide. For now, just know they exist and are different from Web Workers.
Normally, canvas operations happen on the main thread. With OffscreenCanvas, you can move rendering to a worker, keeping the main thread free for user interactions.
```javascript
// main.js
const canvas = document.getElementById('myCanvas')

// Transfer control to an OffscreenCanvas
const offscreen = canvas.transferControlToOffscreen()

const worker = new Worker('canvas-worker.js', { type: 'module' })

// Transfer the canvas to the worker
worker.postMessage({ canvas: offscreen }, [offscreen])
```
```javascript
// canvas-worker.js
let ctx

self.onmessage = (event) => {
  if (event.data.canvas) {
    const canvas = event.data.canvas
    ctx = canvas.getContext('2d')
    // Start animation loop in the worker
    animate()
  }
}

function animate() {
  // Clear canvas
  ctx.fillStyle = '#000'
  ctx.fillRect(0, 0, 800, 600)

  // Draw something
  ctx.fillStyle = '#0f0'
  ctx.fillRect(
    Math.random() * 700,
    Math.random() * 500,
    100,
    100
  )

  // Request next frame
  // Note: requestAnimationFrame is available in dedicated workers only
  requestAnimationFrame(animate)
}
```
One common use for OffscreenCanvas is image processing:
```javascript
// main.js
const worker = new Worker('image-worker.js', { type: 'module' })

async function processImage(file) {
  const bitmap = await createImageBitmap(file)
  worker.postMessage({ bitmap, filter: 'grayscale' }, [bitmap]) // Transfer the bitmap
}

worker.onmessage = (event) => {
  const processedBitmap = event.data.bitmap

  // Draw the result on a visible canvas
  const canvas = document.getElementById('result')
  const ctx = canvas.getContext('2d')
  ctx.drawImage(processedBitmap, 0, 0)
}
```
```javascript
// image-worker.js
self.onmessage = async (event) => {
  const { bitmap, filter } = event.data

  // Create an OffscreenCanvas matching the image size
  const canvas = new OffscreenCanvas(bitmap.width, bitmap.height)
  const ctx = canvas.getContext('2d')

  // Draw the image
  ctx.drawImage(bitmap, 0, 0)

  // Get pixel data
  const imageData = ctx.getImageData(0, 0, canvas.width, canvas.height)
  const data = imageData.data

  // Apply grayscale filter
  if (filter === 'grayscale') {
    for (let i = 0; i < data.length; i += 4) {
      const avg = (data[i] + data[i + 1] + data[i + 2]) / 3
      data[i] = avg     // R
      data[i + 1] = avg // G
      data[i + 2] = avg // B
      // Alpha unchanged
    }
  }

  // Put processed data back
  ctx.putImageData(imageData, 0, 0)

  // Convert to bitmap and send back
  const resultBitmap = await createImageBitmap(canvas)
  self.postMessage({ bitmap: resultBitmap }, [resultBitmap])
}
```
OffscreenCanvas is great for games, data visualizations, and image/video processing. Anything that involves heavy canvas work can benefit from being moved to a worker.
Workers cannot access the DOM. They can’t read or modify HTML elements:
```javascript
// worker.js
// ❌ All of these will fail
document.getElementById('app')        // document is undefined
window.location                       // window is undefined
document.createElement('div')         // Can't create elements
element.addEventListener('click', fn) // Can't add event listeners
```
If you need to update the DOM based on worker results, send the data back to the main thread:
```javascript
// worker.js
const result = heavyCalculation()
self.postMessage({ result }) // Send data to main thread
```
```javascript
// main.js
worker.onmessage = (event) => {
  // Update DOM on the main thread
  document.getElementById('result').textContent = event.data.result
}
```
Trying to touch the DOM from inside a worker is the most common mistake. It fails silently or throws cryptic errors:
```javascript
// worker.js
// ❌ WRONG - This won't work
self.onmessage = (event) => {
  const result = calculate(event.data)
  document.getElementById('output').textContent = result // ERROR!
}

// ✓ CORRECT - Send data back to main thread
self.onmessage = (event) => {
  const result = calculate(event.data)
  self.postMessage(result) // Main thread updates the DOM
}
```
Workers have overhead. Creating them, posting messages, and cloning data all take time:
```javascript
// ❌ WRONG - Worker overhead exceeds computation time
const worker = new Worker('worker.js')
worker.postMessage([1, 2, 3]) // Adding 3 numbers doesn't need a worker

// ✓ CORRECT - Just do it on the main thread
const sum = [1, 2, 3].reduce((a, b) => a + b, 0)
```
Rule of thumb: Only use workers for tasks that take more than 50-100ms. For quick operations, the overhead isn’t worth it.
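For repeated heavy tasks, reuse workers through a pool instead of spawning one per task. The WorkerPool class imported in the next snippet isn't defined elsewhere in this article, so here is a minimal sketch of what it could look like: a queue-based design where the runTask and terminate names come from the usage below, and everything else is an assumption:

```javascript
// WorkerPool.js sketch: a fixed set of workers, a FIFO task queue,
// and one in-flight task per worker. (Add `export` when used as a module.)
class WorkerPool {
  constructor(script, size = 4) {
    this.workers = []
    this.idle = []
    this.queue = [] // tasks waiting for a free worker

    for (let i = 0; i < size; i++) {
      const worker = new Worker(script)
      this.workers.push(worker)
      this.idle.push(worker)
    }
  }

  // Resolves with the worker's reply for this task
  runTask(data) {
    return new Promise((resolve, reject) => {
      this.queue.push({ data, resolve, reject })
      this._drain()
    })
  }

  _drain() {
    while (this.idle.length > 0 && this.queue.length > 0) {
      const worker = this.idle.pop()
      const { data, resolve, reject } = this.queue.shift()

      worker.onmessage = (event) => {
        resolve(event.data)
        this.idle.push(worker)
        this._drain() // pick up the next queued task
      }
      worker.onerror = (error) => {
        reject(error)
        this.idle.push(worker)
        this._drain()
      }

      worker.postMessage(data)
    }
  }

  terminate() {
    this.workers.forEach(w => w.terminate())
  }
}
```

Each worker handles one task at a time; when it replies, the pool hands it the next queued task, so throughput scales with the pool size rather than the task count.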
```javascript
// main.js - Using the pool
import { WorkerPool } from './WorkerPool.js'

const pool = new WorkerPool('compute-worker.js', 4)

// Process many items in parallel
async function processItems(items) {
  const results = await Promise.all(
    items.map(item => pool.runTask(item))
  )
  return results
}

// Example: process 100 items using 4 workers
const items = Array.from({ length: 100 }, (_, i) => ({
  id: i,
  data: Math.random()
}))

const results = await processItems(items)
console.log(results)

// Clean up when done
pool.terminate()
```
```javascript
// compute-worker.js
self.onmessage = (event) => {
  const { id, data } = event.data

  // Simulate heavy computation
  let result = data
  for (let i = 0; i < 1000000; i++) {
    result = Math.sin(result) * Math.cos(result)
  }

  self.postMessage({ id, result })
}
```
navigator.hardwareConcurrency returns the number of logical CPU cores. Using this as your pool size lets you maximize parallelism without oversubscribing.
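A sketch of sizing a pool this way (the fallback of 4 is an arbitrary assumption for environments where the property is unavailable):

```javascript
// One worker per logical CPU core is a sensible default pool size.
// hardwareConcurrency is optional in the spec, so keep a fallback.
const cores = (typeof navigator !== 'undefined' && navigator.hardwareConcurrency) || 4

console.log(`Sizing pool to ${cores} workers`)
// e.g. const pool = new WorkerPool('compute-worker.js', cores)
```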
Sometimes you want a worker without a separate file. You can create workers from strings using Blob URLs:
```javascript
// Create a worker from a string (no separate file needed!)
function createWorkerFromString(code) {
  const blob = new Blob([code], { type: 'application/javascript' })
  const url = URL.createObjectURL(blob)
  const worker = new Worker(url)

  // The URL can be revoked immediately after the worker is created.
  // The browser keeps the blob data until the worker finishes loading.
  URL.revokeObjectURL(url)

  return worker
}

// Usage
const workerCode = `
  self.onmessage = (event) => {
    const numbers = event.data
    const sum = numbers.reduce((a, b) => a + b, 0)
    self.postMessage(sum)
  }
`

const worker = createWorkerFromString(workerCode)
worker.postMessage([1, 2, 3, 4, 5])
worker.onmessage = (e) => console.log('Sum:', e.data) // Sum: 15
```
- Web Workers provide true parallelism — Unlike async/await (which is concurrent but single-threaded), workers run on separate CPU threads simultaneously.
- Use workers for CPU-bound tasks — Async is for waiting (network, timers). Workers are for computing (heavy calculations, data processing).
- Workers communicate via postMessage — Data is copied by default using the structured clone algorithm. Workers can’t directly access main thread variables.
- Workers can’t touch the DOM — No document, no window, no localStorage. If you need to update the UI, send data back to the main thread.
- Transfer large data instead of copying — For big ArrayBuffers, use postMessage(data, [data]) to transfer ownership. The transfer is nearly instant.
- Module workers are the modern approach — Use new Worker('file.js', { type: 'module' }) to enable import/export syntax and modern features.
- Three types of workers exist — Dedicated (one owner), Shared (multiple tabs), and Service Workers (network proxy). Use Dedicated for most cases.
- Always terminate workers when done — Call worker.terminate() or they’ll keep running and consuming resources.
- Don’t overuse workers for small tasks — Worker creation and message passing have overhead. Only use them for tasks taking 50ms+.
- Worker pools improve performance — Reuse workers instead of creating new ones for repeated tasks. Match pool size to CPU cores.
Question 1: What's the difference between async/await and Web Workers?
Answer: Async/await provides concurrency on a single thread. When you await, JavaScript pauses that function and runs other code, but everything still runs on one thread, taking turns.

Web Workers provide parallelism on multiple threads. A worker runs on a completely separate thread, executing simultaneously with the main thread.
```javascript
// Async: Takes turns on one thread
async function fetchData() {
  await fetch('/api') // Pauses here, other code can run
}

// Workers: Actually runs at the same time
const worker = new Worker('heavy-task.js')
worker.postMessage(data) // Worker computes in parallel
// Main thread continues immediately
```
Use async for I/O-bound tasks (network, files). Use workers for CPU-bound tasks (calculations, processing).
Question 2: Why can't workers access the DOM?
Answer: The DOM is not thread-safe. If multiple threads could modify the DOM simultaneously, you’d get race conditions and corrupted state. Browsers would need complex locking mechanisms.

Instead, browsers made a design choice: only the main thread can touch the DOM. Workers do computation and send results back:
```javascript
// worker.js
// ❌ Can't do this
document.getElementById('result').textContent = 'Done'

// ✓ Send data back instead
self.postMessage({ result: 'Done' })

// main.js
worker.onmessage = (e) => {
  document.getElementById('result').textContent = e.data.result
}
```
This constraint keeps things simple and bug-free.
Question 3: When should you use transferable objects?
Answer: Use transferable objects when:

- You’re sending large data (> 1MB)
- You don’t need to keep the data in the sending context
```javascript
// Large buffer (100MB)
const buffer = new ArrayBuffer(100 * 1024 * 1024)

// ❌ SLOW: Copies 100MB
worker.postMessage(buffer)

// ✓ FAST: Transfers ownership instantly
worker.postMessage(buffer, [buffer])
// buffer is now empty (byteLength = 0)
```
Transferable objects include: ArrayBuffer, MessagePort, ImageBitmap, OffscreenCanvas, and various streams.
Question 4: What's the difference between Dedicated and Shared Workers?
Answer: Dedicated Workers belong to a single script. Only that script can communicate with them.
```javascript
const worker = new Worker('worker.js') // Only this script uses it
```
Shared Workers can be accessed by multiple scripts, even across different tabs of the same origin.
```javascript
// Tab 1 and Tab 2 both connect to the same worker
const worker = new SharedWorker('shared.js')
worker.port.postMessage('hello')
```
Use Shared Workers for:

- Shared state across tabs
- A single WebSocket connection for multiple tabs
- Reducing memory by sharing one worker instance
Note: Shared Workers have limited browser support (no support in Chrome for Android or Samsung Internet; Safari only added support in version 16).
Question 5: How do you create a worker without a separate file?
Answer: Use a Blob URL to create a worker from a string:
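A condensed sketch of the technique (the `typeof Worker` guard just keeps it inert where the constructor is unavailable):

```javascript
// Build a worker from inline source via a Blob URL
const code = `self.onmessage = (e) => self.postMessage(e.data * 2)`
const blob = new Blob([code], { type: 'application/javascript' })
const url = URL.createObjectURL(blob)

if (typeof Worker !== 'undefined') { // inert outside browsers
  const worker = new Worker(url)
  worker.onmessage = (e) => console.log(e.data) // 42
  worker.postMessage(21)
}
```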
This is useful for simple tasks or demos, but has limitations: no imports, no closures, harder to debug.
Question 6: What happens if you forget to terminate a worker?
Answer: The worker keeps running and consuming resources (memory, CPU time). If you create workers in a loop or on repeated events without terminating them, you’ll leak resources:
```javascript
// ❌ Memory leak: creates new worker every click
button.onclick = () => {
  const worker = new Worker('task.js')
  worker.postMessage(data)
  worker.onmessage = (e) => showResult(e.data)
  // Worker never terminated!
}

// ✓ Fixed: terminate after use
button.onclick = () => {
  const worker = new Worker('task.js')
  worker.postMessage(data)
  worker.onmessage = (e) => {
    showResult(e.data)
    worker.terminate() // Clean up
  }
}

// ✓ Better: reuse one worker
const worker = new Worker('task.js')
worker.onmessage = (e) => showResult(e.data)

button.onclick = () => {
  worker.postMessage(data) // Reuse
}
```
Web Workers are a browser API that lets you run JavaScript in background threads, separate from the main UI thread. As defined in the WHATWG HTML Living Standard, workers execute in an isolated global context with no access to the DOM. They communicate with the main thread through postMessage() and are supported in all modern browsers.
What is the difference between Web Workers and async/await?
Async/await provides concurrency on a single thread — it helps with waiting for I/O but cannot speed up CPU-intensive computation. Web Workers provide true parallelism by running code on separate OS threads. Use async/await for network requests and timers; use Web Workers for heavy calculations, image processing, or data parsing that would otherwise freeze the UI.
Can Web Workers access the DOM?
No. Web Workers run in an isolated context and cannot access document, window, or any DOM APIs. This is by design — the DOM is not thread-safe, so allowing concurrent access would cause race conditions. Workers communicate results back to the main thread via postMessage(), and the main thread updates the DOM.
What is the difference between Dedicated Workers, Shared Workers, and Service Workers?
Dedicated Workers serve a single page and are the most common type. Shared Workers can be accessed by multiple pages from the same origin. Service Workers act as network proxies that enable offline support and push notifications. According to Can I Use data, Dedicated Workers have over 98% browser support, while Shared Workers have more limited support.
What are transferable objects in Web Workers?
Transferable objects (like ArrayBuffer, MessagePort, and OffscreenCanvas) can be moved between threads without copying. Normal postMessage() data is cloned using the structured clone algorithm, which is slow for large data. Transferring ownership is nearly instant regardless of size, but the sending context loses access to the transferred object.