Promise.all, Promise.allSettled, Promise.race, AbortController, Web Workers, and the Scheduler API — the concurrency toolkit that sits between “one await at a time” and “this is too complex.” Your apps are slower than they should be.
async/await made asynchronous JavaScript readable. It also made it easy to write sequential code when parallel code would be dramatically faster. Most JavaScript developers write await before every async operation, wait for it to complete, then move on — and never notice the seconds they’re leaving on the table.
This post covers the concurrency primitives that sit between “one await at a time” and “full concurrency library” — the patterns that are already in the language, already in the browser, and would make your applications measurably faster if you used them.
The Sequential Trap
Before anything else — the most common concurrency mistake, in code every developer has written:
// ✗ Sequential — each awaits the previous one
// Total time: 200ms + 300ms + 150ms = 650ms
async function loadDashboard(userId) {
const user = await fetchUser(userId) // 200ms
const orders = await fetchOrders(userId) // 300ms — waits for user
const analytics = await fetchAnalytics(userId) // 150ms — waits for orders
return { user, orders, analytics }
}
None of these three requests depend on each other. They’re waiting for no reason. The user spends 650ms staring at a spinner when they could be looking at a 300ms spinner.
// ✓ Parallel — all three fire simultaneously
// Total time: max(200ms, 300ms, 150ms) = 300ms
async function loadDashboard(userId) {
const [user, orders, analytics] = await Promise.all([
fetchUser(userId),
fetchOrders(userId),
fetchAnalytics(userId),
])
return { user, orders, analytics }
}
Same requests. Same data. 54% faster (650ms down to 300ms).
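There is also a middle ground for when one call genuinely depends on another: start every independent request immediately, and only await at the point of dependency. The fetchers below are timer-backed stubs (our own names, not real APIs) so the sketch runs anywhere; assume fetchOrders really needs a field from the user record.

```javascript
// Timer-backed stubs standing in for real API calls
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms))
const fetchUser = (id) => delay(50, { id, accountId: 'acct-1' })
const fetchOrders = (accountId) => delay(50, [{ accountId, total: 99 }])
const fetchAnalytics = (id) => delay(50, { views: 10 })

async function loadDashboard(userId) {
  const userPromise = fetchUser(userId)            // independent: starts now
  const analyticsPromise = fetchAnalytics(userId)  // independent: starts now
  const user = await userPromise
  const orders = await fetchOrders(user.accountId) // genuinely depends on user
  const analytics = await analyticsPromise         // likely settled already
  return { user, orders, analytics }
}
```

Total time is roughly 100ms (user, then orders) rather than the 150ms of three sequential awaits, because analytics overlaps both.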
Promise.all: Parallel Execution with a Single Failure Mode
Promise.all takes an array of promises and returns a single promise that resolves with an array of results when all input promises resolve. If any promise rejects, the whole Promise.all rejects immediately.
// Basic usage
const [user, posts, comments] = await Promise.all([
fetchUser(id),
fetchPosts(id),
fetchComments(id),
])
// With typed results (TypeScript)
const [profile, settings]: [UserProfile, UserSettings] = await Promise.all([
api.get<UserProfile>('/profile'),
api.get<UserSettings>('/settings'),
])
When Promise.all Is the Wrong Tool
The fail-fast behaviour is a feature when requests are interdependent. It’s a problem when they’re independent and you want partial results.
// Problem: if fetchNotifications rejects, you lose user AND posts too
const [user, posts, notifications] = await Promise.all([
fetchUser(id), // succeeds
fetchPosts(id), // succeeds
fetchNotifications(), // network error — kills everything
])
// The user sees nothing instead of most of the page
Promise.allSettled: Parallel Execution with Partial Results
Promise.allSettled waits for all promises to complete regardless of success or failure. It never rejects. Each result is an object with a status of 'fulfilled' or 'rejected'.
const results = await Promise.allSettled([
fetchUser(id),
fetchPosts(id),
fetchNotifications(), // allowed to fail
])
for (const result of results) {
if (result.status === 'fulfilled') {
console.log('Success:', result.value)
} else {
console.error('Failed:', result.reason)
}
}
A Practical Pattern: Safe Parallel Loading
async function loadPageData(userId) {
const [userResult, postsResult, notificationsResult] = await Promise.allSettled([
fetchUser(userId),
fetchPosts(userId),
fetchNotifications(userId),
])
return {
user: userResult.status === 'fulfilled' ? userResult.value : null,
posts: postsResult.status === 'fulfilled' ? postsResult.value : [],
notifications: notificationsResult.status === 'fulfilled' ? notificationsResult.value : [],
errors: [userResult, postsResult, notificationsResult]
.filter(r => r.status === 'rejected')
.map(r => r.reason),
}
}
A Reusable settled() Helper
type SettledResult<T> =
| { ok: true; value: T }
| { ok: false; error: unknown }
async function settled<T>(promise: Promise<T>): Promise<SettledResult<T>> {
try {
return { ok: true, value: await promise }
} catch (error) {
return { ok: false, error }
}
}
// Usage
const [user, posts] = await Promise.all([
settled(fetchUser(id)),
settled(fetchPosts(id)),
])
if (user.ok) renderUser(user.value)
if (!posts.ok) showError('Posts failed to load', posts.error)
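Here is a plain-JavaScript rendition of the helper, exercised end to end with stub promises so you can see both result shapes:

```javascript
// settled() converts rejection into a value, so Promise.all never short-circuits
async function settled(promise) {
  try {
    return { ok: true, value: await promise }
  } catch (error) {
    return { ok: false, error }
  }
}

async function demo() {
  const [a, b] = await Promise.all([
    settled(Promise.resolve(42)),
    settled(Promise.reject(new Error('boom'))),
  ])
  console.log(a.ok, a.value)         // true 42
  console.log(b.ok, b.error.message) // false boom
}
demo()
```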
Promise.race: First Response Wins
Promise.race resolves or rejects with the first promise to settle, discarding all others. The canonical use case is implementing request timeouts.
// Timeout any async operation
function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
let timer!: ReturnType<typeof setTimeout>
const timeout = new Promise<never>((_, reject) => {
timer = setTimeout(() => reject(new Error(`Timed out after ${ms}ms`)), ms)
})
// Clear the timer either way so it cannot fire after the race settles
return Promise.race([promise, timeout]).finally(() => clearTimeout(timer))
}
// Usage
try {
const data = await withTimeout(fetchHeavyReport(), 5000)
} catch (err) {
if (err instanceof Error && err.message.startsWith('Timed out')) showTimeoutMessage()
}
Fallback to Cache on Slow Network
async function fetchWithCacheFallback(url) {
const networkRequest = fetch(url).then(r => r.json())
try {
return await Promise.race([
networkRequest,
new Promise((_, reject) => setTimeout(() => reject(new Error('slow')), 2000)),
])
} catch {
// Network was slow: serve the cache now, refresh it when the network catches up
networkRequest
.then(data => setCachedValue(url, data))
.catch(() => {}) // ignore the late failure; the cache was already served
return getCachedValue(url)
}
}
Promise.any: First Success Wins
Promise.any resolves with the first promise that fulfils, ignoring rejections. It only rejects if all promises reject, and then with an AggregateError that collects every rejection reason. This is different from Promise.race — a rejection doesn’t win.
// Try multiple CDN endpoints — use the first that responds successfully
// (fetch only rejects on network failure, so treat HTTP error statuses as failures too)
const fetchOk = (url) =>
fetch(url).then(r => (r.ok ? r : Promise.reject(new Error(`HTTP ${r.status}`))))
async function fetchFromAnyCDN(path) {
return Promise.any([
fetchOk(`https://cdn-us.example.com${path}`),
fetchOk(`https://cdn-eu.example.com${path}`),
fetchOk(`https://cdn-ap.example.com${path}`),
])
}
// Feature detection — use the first available storage API
const storage = await Promise.any([
openIndexedDB('app-cache'),
openCacheStorage('app-cache'),
fallbackToMemoryCache(),
])
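When every input rejects, the AggregateError carries each individual reason in its errors array. A minimal sketch of that all-fail path:

```javascript
async function demo() {
  try {
    await Promise.any([
      Promise.reject(new Error('us down')),
      Promise.reject(new Error('eu down')),
    ])
  } catch (err) {
    console.log(err instanceof AggregateError)  // true
    console.log(err.errors.map(e => e.message)) // [ 'us down', 'eu down' ]
  }
}
demo()
```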
AbortController: Cancelling Async Operations
AbortController is the native browser API for cancelling fetch requests and other async operations. Without it, you can’t stop a pending request when the user navigates away, searches for something new, or closes a modal — and stale responses can set state on components that have already unmounted.
Basic Cancellation
const controller = new AbortController()
const timeout = setTimeout(() => controller.abort(), 10000)
try {
const response = await fetch('/api/data', { signal: controller.signal })
clearTimeout(timeout)
return await response.json()
} catch (err) {
if (err.name === 'AbortError') console.log('Request was cancelled')
else throw err
}
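Modern runtimes (recent browsers, Node 17.3+) also ship AbortSignal.timeout, which collapses the controller-plus-setTimeout dance into one call. One caveat: a timed-out signal aborts with a TimeoutError, not an AbortError, so the error check changes.

```javascript
// AbortSignal.timeout(ms) returns a signal that auto-aborts after the delay,
// with no manual controller or clearTimeout bookkeeping
const signal = AbortSignal.timeout(50)
signal.addEventListener('abort', () => {
  // A timed-out signal aborts with a 'TimeoutError', not an 'AbortError'
  console.log(signal.reason.name)
})
setTimeout(() => {}, 150) // keep the demo process alive long enough to observe it
```

With fetch it becomes a one-liner: await fetch('/api/data', { signal: AbortSignal.timeout(10000) }).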
Cancelling on Component Unmount (React)
function useUser(userId) {
const [user, setUser] = useState(null)
useEffect(() => {
const controller = new AbortController()
async function load() {
try {
const res = await fetch(`/api/users/${userId}`, { signal: controller.signal })
const data = await res.json()
setUser(data)
} catch (err) {
if (err.name !== 'AbortError') throw err
}
}
load()
return () => controller.abort() // cleanup on unmount or userId change
}, [userId])
return user
}
Cancelling on Component Unmount (Vue)
import { ref, watchEffect } from 'vue'
function useUser(userId) {
const user = ref(null)
watchEffect(async (onCleanup) => {
const controller = new AbortController()
onCleanup(() => controller.abort())
try {
const res = await fetch(`/api/users/${userId.value}`, { signal: controller.signal })
user.value = await res.json()
} catch (err) {
if (err.name !== 'AbortError') throw err
}
})
return user
}
Aborting Multiple Requests at Once
class RequestManager {
private controllers = new Map<string, AbortController>()
async fetch(key: string, url: string, options = {}) {
this.controllers.get(key)?.abort() // cancel in-flight request with same key
const controller = new AbortController()
this.controllers.set(key, controller)
try {
return await fetch(url, { ...options, signal: controller.signal })
} finally {
// Delete only our own entry: a newer request with the same key may have replaced it
if (this.controllers.get(key) === controller) this.controllers.delete(key)
}
}
abortAll() {
this.controllers.forEach(c => c.abort())
this.controllers.clear()
}
}
const manager = new RequestManager()
// Only the latest search survives — previous ones are cancelled
const results = await manager.fetch('search', `/api/search?q=${query}`)
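If callers pass their own signal too, newer runtimes (Chrome 116+, Node 20.3+) provide AbortSignal.any to merge it with the manager's controller, so either side can cancel the same request. A minimal sketch:

```javascript
// AbortSignal.any() yields a signal that aborts as soon as ANY input aborts
const caller = new AbortController()
const manager = new AbortController()
const merged = AbortSignal.any([caller.signal, manager.signal])

manager.abort(new Error('superseded'))
console.log(merged.aborted)        // true
console.log(merged.reason.message) // 'superseded'
```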
Concurrency Limiting: Not Too Parallel
Running 50 requests simultaneously isn’t always better. It can overwhelm your server, trigger rate limits, or crash the browser tab. A concurrency limiter runs a fixed number of operations at a time.
async function withConcurrencyLimit<T>(
tasks: (() => Promise<T>)[],
concurrency: number
): Promise<T[]> {
const results: T[] = new Array(tasks.length)
let nextIndex = 0
async function worker() {
while (nextIndex < tasks.length) {
const index = nextIndex++
results[index] = await tasks[index]()
}
}
await Promise.all(
Array.from({ length: Math.min(concurrency, tasks.length) }, worker)
)
return results
}
// Process 1000 images — 5 at a time
const processed = await withConcurrencyLimit(
imageUrls.map(url => () => processImage(url)),
5
)
Batch Upload with Progress
async function uploadFiles(files: File[], onProgress: (pct: number) => void) {
let completed = 0
const tasks = files.map(file => async () => {
const result = await uploadFile(file)
onProgress(++completed / files.length * 100)
return result
})
return withConcurrencyLimit(tasks, 3) // 3 uploads at a time
}
Web Workers: True Parallelism for CPU-Bound Work
JavaScript is single-threaded. async/await, Promise.all, and every pattern above still run on the main thread — they’re managing callback order, not executing in parallel. For CPU-intensive work (image processing, PDF generation, encryption, large data transformations), the main thread blocks and the UI freezes.
Web Workers run in a separate background thread, communicating via messages.
Basic Worker Pattern
// worker.js — runs in a background thread
self.onmessage = function(event) {
const { data, operation } = event.data
let result
if (operation === 'sort') result = [...data].sort((a, b) => a - b)
if (operation === 'filter') result = data.filter(n => n % 2 === 0)
if (operation === 'sum') result = data.reduce((acc, n) => acc + n, 0)
self.postMessage({ result })
}
// main.js — keep the UI responsive
const worker = new Worker('/worker.js')
// Note: reassigning onmessage means this wrapper supports one operation in flight at a time
function runInWorker(operation, data) {
return new Promise((resolve, reject) => {
worker.onmessage = (e) => resolve(e.data.result)
worker.onerror = (e) => reject(new Error(e.message))
worker.postMessage({ operation, data })
})
}
// Runs in the background — UI stays responsive
const sorted = await runInWorker('sort', largeArray)
A TypeScript Worker Wrapper
interface WorkerMessage { id: string; operation: string; payload: unknown }
interface WorkerResponse { id: string; result?: unknown; error?: string }
class TypedWorker {
private worker: Worker
private pending = new Map<string, { resolve: Function; reject: Function }>()
constructor(url: string) {
// The url parameter is only in scope here, not at field initialisers
this.worker = new Worker(url)
this.worker.onmessage = (e: MessageEvent<WorkerResponse>) => {
const { id, result, error } = e.data
const cb = this.pending.get(id)
if (!cb) return
this.pending.delete(id)
error ? cb.reject(new Error(error)) : cb.resolve(result)
}
}
run<T>(operation: string, payload: unknown): Promise<T> {
const id = crypto.randomUUID()
return new Promise<T>((resolve, reject) => {
this.pending.set(id, { resolve, reject })
this.worker.postMessage({ id, operation, payload } satisfies WorkerMessage)
})
}
terminate() { this.worker.terminate() }
}
const worker = new TypedWorker('/compute-worker.js')
const result = await worker.run<number[]>('sort', largeDataset)
Using Comlink: The Ergonomic Worker Library
npm install comlink
// heavy-worker.js
import { expose } from 'comlink'
expose({
async processImage(imageData) { return applyFilters(imageData) },
async generatePDF(content) { return buildPDF(content) },
async parseCSV(csvString) { return parseAndValidate(csvString) },
})
// main.js
import { wrap } from 'comlink'
const workerApi = wrap(new Worker('/heavy-worker.js', { type: 'module' }))
// Looks like a regular async function — runs in background thread
const image = await workerApi.processImage(rawImageData)
const pdf = await workerApi.generatePDF(reportContent)
Transferable Objects: Zero-Copy Data Transfer
Data passed to a Worker is normally copied — expensive for large ArrayBuffers. Transferable objects are moved, not copied. The original becomes unusable, but the transfer is instant.
const buffer = new ArrayBuffer(100 * 1024 * 1024) // 100MB
// Without transfer: 100MB copy — slow
worker.postMessage({ buffer })
// With transfer: zero-copy — instant
worker.postMessage({ buffer }, [buffer])
// buffer in the main thread is now detached (byteLength 0); the bytes live in the worker
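You can observe the detachment without spinning up a worker: structuredClone accepts the same transfer list that postMessage does.

```javascript
const buffer = new ArrayBuffer(8)
// Transferring moves ownership of the bytes into the clone
const clone = structuredClone({ buffer }, { transfer: [buffer] })
console.log(buffer.byteLength)       // 0: the original is detached
console.log(clone.buffer.byteLength) // 8: the bytes moved, not copied
```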
The Scheduler API: Prioritising Work on the Main Thread
The Scheduler API gives you fine-grained control over when tasks run on the main thread. Three priority levels replace setTimeout(fn, 0) hacks. At the time of writing it ships in Chromium-based browsers, so feature-detect before relying on it.
// user-blocking: runs ASAP — for critical UI updates
await scheduler.postTask(() => updateCriticalUI(), { priority: 'user-blocking' })
// user-visible: runs when main thread is clear — for secondary content
await scheduler.postTask(() => loadSecondaryContent(), { priority: 'user-visible' })
// background: runs when browser is idle — for prefetching
await scheduler.postTask(() => prefetchNextPage(), { priority: 'background' })
Yielding to the Browser Between Expensive Tasks
async function processLargeDataset(items) {
const results = []
for (let i = 0; i < items.length; i++) {
results.push(processItem(items[i]))
// Every 100 items (after the first), yield so the browser can paint and handle input
if (i > 0 && i % 100 === 0) await scheduler.yield()
}
return results
}
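Since scheduler.yield() currently ships only in Chromium-based browsers, loops like the one above usually go through a small feature-detected helper (the name yieldToBrowser is ours); a zero-delay macrotask is a reasonable fallback:

```javascript
// Prefer scheduler.yield() where available; otherwise cede the thread
// with a zero-delay macrotask so paint and input still get a turn
async function yieldToBrowser() {
  if (typeof scheduler !== 'undefined' && typeof scheduler.yield === 'function') {
    return scheduler.yield()
  }
  return new Promise(resolve => setTimeout(resolve, 0))
}
```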
Yield Only When User Is Interacting
async function processWithInputAwareness(items) {
const results = []
for (const item of items) {
results.push(processItem(item))
// Only interrupt when there's user input waiting
if (navigator.scheduling?.isInputPending()) {
await scheduler.yield()
}
}
return results
}
Putting It All Together: A Real Dashboard Loader
class DashboardLoader {
private controller: AbortController | null = null
private worker = new TypedWorker('/transform-worker.js')
async load(userId: string) {
this.controller?.abort()
this.controller = new AbortController()
const { signal } = this.controller
// Resilient parallel loading — partial results are OK
const [userResult, statsResult, ordersResult] = await Promise.allSettled([
fetch(`/api/users/${userId}`, { signal }).then(r => r.json()),
fetch(`/api/stats/${userId}`, { signal }).then(r => r.json()),
fetch(`/api/orders/${userId}`, { signal }).then(r => r.json()),
])
const user = userResult.status === 'fulfilled' ? userResult.value : null
const stats = statsResult.status === 'fulfilled' ? statsResult.value : null
const orders = ordersResult.status === 'fulfilled' ? ordersResult.value : []
// Heavy transformation in background — UI stays responsive
const transformed = orders.length > 0
? await this.worker.run('transformOrders', orders)
: []
// Load details with concurrency limit — don't hammer the server
const enriched = await withConcurrencyLimit(
transformed.map(order => () =>
fetch(`/api/orders/${order.id}/details`, { signal }).then(r => r.json())
),
5
)
return { user, stats, orders: enriched }
}
cancel() { this.controller?.abort() }
}
The Concurrency Decision Tree
Is the work CPU-intensive (sorting, parsing, image processing)?
├── Yes → Web Worker
└── No (I/O bound — network, storage)
│
Are the operations independent of each other?
├── No (B needs A's result) → Sequential await
└── Yes
│
Can any of them fail without breaking the page?
├── No (all required) → Promise.all
└── Yes (partial results OK) → Promise.allSettled
│
Do you need the FIRST successful result?
├── Yes → Promise.any
└── No → Do you have many operations (> 10)?
├── Yes → Concurrency limiter
└── No → Promise.all / Promise.allSettled
Quick Reference
| Pattern | Use when | Behaviour |
|---|---|---|
| await sequentially | B depends on A | 1 at a time — total latency = sum |
| Promise.all | All required, all independent | Parallel — fails if any fail |
| Promise.allSettled | Partial results acceptable | Parallel — never fails, each has status |
| Promise.race | First settler wins | First resolve OR reject wins |
| Promise.any | First success wins | First resolve wins — all fail = AggregateError |
| AbortController | Need to cancel in-flight requests | Cancels fetch, works with cleanup functions |
| Concurrency limiter | Many operations, limited throughput | N parallel at a time |
| Web Worker | CPU-intensive, must not block UI | True parallelism, separate thread |
| scheduler.postTask | Control main thread task priority | user-blocking > user-visible > background |
| scheduler.yield() | Long loops, keep UI responsive | Cedes control to browser then resumes |
Final Thoughts
async/await didn’t simplify concurrency — it simplified sequential asynchronous code. The moment you have two operations that don’t depend on each other, await-ing them one at a time is a choice to be slower than necessary.
Promise.all and Promise.allSettled are table stakes. AbortController is non-negotiable in any component that fetches data. Web Workers exist specifically to prevent CPU-intensive work from freezing your UI. The Scheduler API is the modern answer to jank — long-running main thread tasks that block paint and input.
None of these are exotic. They’re all standard JavaScript. They’re all available in every modern browser. They’re sitting in the language waiting to be used.
Write the sequential version first if that’s what you need. But know when it isn’t — and reach for the right tool.
