JavaScript is single-threaded
JavaScript has exactly one thread of execution. At any given moment, only one piece of code runs. There is no parallel execution within a single JavaScript environment — no two functions run at the same time, no shared memory between concurrent threads, no race conditions in the traditional sense.
This sounds like a limitation, but for the browser's primary use case — responding to user input and updating the UI — it is a deliberate design decision. A single thread means the DOM is never touched by two pieces of code simultaneously, which eliminates an entire class of concurrency bugs.
The question is: how does JavaScript handle operations that take a long time, like network requests or file reads, without blocking the thread and freezing the UI? The answer is the event loop — a coordination mechanism between the JavaScript engine, the browser runtime, and a set of queues that schedule work.
To understand the event loop, you need to understand four things: the call stack, Web APIs, the task queue, and the microtask queue. Each plays a distinct role in the overall system.
The call stack
The call stack is where JavaScript tracks which function is currently running and which functions are waiting for their callees to return. It follows LIFO (Last In, First Out) ordering — the most recently called function is always at the top and runs to completion before the function below it can continue.
function third() {
console.log('third');
}
function second() {
third();
console.log('second');
}
function first() {
second();
console.log('first');
}
first();
When first() is called, the stack has one frame. When first() calls second(), a second frame is pushed on top. When second() calls third(), a third frame is pushed. third() logs "third" and returns — its frame is popped. second() resumes, logs "second", returns — its frame is popped. first() resumes, logs "first", returns — the stack is empty again.
The critical constraint is run-to-completion: once a function starts executing, it cannot be paused by external events. No browser event, no timer, no network response can interrupt a synchronous function mid-execution. Everything waits for the call stack to become empty.
This is why long-running synchronous code blocks the page. If you write a loop that takes two seconds to complete, the browser cannot respond to any click, scroll, or keypress during those two seconds:
// This blocks the thread for ~2 seconds
// The page is frozen for the entire duration
function blockFor2Seconds() {
const start = Date.now();
while (Date.now() - start < 2000) {
// spin-wait — never do this
}
}
blockFor2Seconds();
console.log('Done'); // runs after 2 seconds
The right model is to hand off slow work to the browser's built-in APIs, which operate outside the JavaScript thread, and receive the result as a callback scheduled through the event loop.
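For CPU-bound work that cannot be handed to a Web API, the same principle applies: break it up and yield back to the event loop. Here is a minimal sketch of that idea — `processInChunks` and the chunk size are illustrative helpers, not a standard API:

```javascript
// Sketch: split CPU-bound work into chunks and yield between them with
// setTimeout(..., 0), so the browser can handle clicks and keypresses
// in between. `processInChunks` is an illustrative helper, not a built-in.
function processInChunks(items, chunkSize, onDone) {
  let i = 0;
  function runChunk() {
    const end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) {
      // ...one unit of work per item goes here...
    }
    if (i < items.length) {
      setTimeout(runChunk, 0); // yield: let the event loop run other tasks
    } else {
      onDone();
    }
  }
  runChunk();
}
```

Each chunk is a separate macrotask, so the browser gets a chance to process input and paint between chunks instead of freezing for the whole duration.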
Web APIs and the browser runtime
The browser provides a set of APIs that perform work outside the JavaScript thread. setTimeout, fetch, addEventListener, XMLHttpRequest, the Geolocation API — none of these are part of the JavaScript specification. They are provided by the browser runtime (or Node.js runtime on the server) and run in separate threads managed by the host environment.
When you call setTimeout(callback, 1000), here is what actually happens:
- JavaScript calls setTimeout — this is a Web API, not a JS function.
- The browser registers a timer in its own timer subsystem (separate thread).
- setTimeout returns immediately. The call stack continues with the next line.
- After 1000ms, the browser's timer subsystem pushes the callback into the task queue.
- When the call stack is empty, the event loop picks the callback from the task queue and executes it.
console.log('1 - synchronous');
setTimeout(() => {
console.log('3 - setTimeout callback (task queue)');
}, 0);
console.log('2 - synchronous');
Even with a 0ms delay, the setTimeout callback runs last — after all synchronous code on the call stack has finished. The callback was handed off to the browser, which placed it in the task queue. The event loop only checks the task queue once the call stack is empty.
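You can observe this directly: even a timer that has already expired cannot fire while synchronous code holds the stack. A small sketch (the 100ms spin-wait is only for demonstration):

```javascript
setTimeout(() => console.log('timer fired'), 0);

const start = Date.now();
while (Date.now() - start < 100) {
  // hold the call stack for ~100ms; the expired timer must wait
}
console.log('sync finished');
// 'sync finished' prints first; 'timer fired' follows, roughly 100ms late
```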
The task queue (macrotasks)
The task queue (also called the macrotask queue) holds callbacks that are ready to run but waiting for the call stack to clear. The event loop's job is to continuously check: "Is the call stack empty? If so, is there anything in the task queue?" If both are true, it dequeues one task and pushes it onto the call stack.
Sources that schedule macrotasks include:
- setTimeout and setInterval callbacks
- I/O events (file system reads in Node.js, XHR callbacks)
- DOM events (click, keydown, mousemove)
- setImmediate in Node.js
- UI rendering (in some models, treated as a task)
A key rule: the event loop picks exactly one macrotask per iteration. After running it, it checks the microtask queue (more on that next), potentially renders, and only then picks the next macrotask.
// Each setTimeout schedules a separate macrotask
setTimeout(() => console.log('task 1'), 0);
setTimeout(() => console.log('task 2'), 0);
setTimeout(() => console.log('task 3'), 0);
console.log('sync');
All three setTimeout callbacks are queued as separate macrotasks. The synchronous code runs first (call stack), then the event loop picks them up one per iteration in the order they were queued.
The microtask queue
The microtask queue is a separate, higher-priority queue that is checked between macrotasks. After every macrotask completes — and before the event loop picks the next macrotask or runs a rendering step — the event loop drains the microtask queue completely.
This means: if a microtask schedules another microtask, that second microtask also runs before any macrotask. The queue is not drained one item at a time between macrotasks — it is run to full completion.
Sources that schedule microtasks:
- Promise.then(), Promise.catch(), Promise.finally() callbacks
- queueMicrotask(callback)
- MutationObserver callbacks
- process.nextTick() in Node.js (runs even before Promise callbacks)
console.log('1 - sync start');
setTimeout(() => console.log('4 - macrotask (setTimeout)'), 0);
Promise.resolve()
.then(() => console.log('3 - microtask (Promise.then)'));
console.log('2 - sync end');
The Promise callback (microtask) runs before the setTimeout callback (macrotask), even though both were registered at approximately the same time. The order is: synchronous code → microtask queue → macrotask queue.
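The drain-to-completion rule described above can be seen directly when one microtask schedules another — a small sketch using queueMicrotask:

```javascript
setTimeout(() => console.log('macrotask'), 0);

queueMicrotask(() => {
  console.log('micro 1');
  // Scheduled from inside a microtask: still runs before any macrotask
  queueMicrotask(() => console.log('micro 2'));
});
// Output: micro 1, micro 2, macrotask
```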
Here is a more complex example that makes the priority crystal clear:
setTimeout(() => {
console.log('macro 1');
Promise.resolve().then(() => console.log('micro inside macro 1'));
}, 0);
setTimeout(() => console.log('macro 2'), 0);
After "macro 1" runs, the event loop checks the microtask queue before moving to "macro 2". It finds the Promise callback that was scheduled inside the first setTimeout, runs it ("micro inside macro 1"), then picks up "macro 2".
Full execution order: step by step
Now let's trace a more complete example, walking through every step explicitly. Read the comments as a running log of what the engine is doing:
console.log('A'); // 1. sync — runs immediately
setTimeout(() => {
console.log('B'); // 5. macrotask
Promise.resolve().then(() => {
console.log('C'); // 6. microtask (queued after B)
});
}, 0);
Promise.resolve().then(() => {
console.log('D'); // 3. microtask
queueMicrotask(() => {
console.log('E'); // 4. microtask (queued by D)
});
});
console.log('F'); // 2. sync — runs immediately
Here is what happened, event loop iteration by iteration:
- Call stack runs synchronously: "A" is logged. setTimeout is registered with the browser. A Promise is created and its .then callback ("D") is queued as a microtask. "F" is logged. The call stack is now empty.
- Microtask queue is drained: "D" runs. Inside D, queueMicrotask adds "E" to the microtask queue. "D" finishes. The queue is not empty — "E" was added. "E" runs. Queue is now empty.
- Event loop picks the next macrotask: The setTimeout callback runs. "B" is logged. A new Promise schedules "C" as a microtask.
- Microtask queue is drained again: "C" runs. Queue is empty.
- Event loop checks for more macrotasks: None remain. Done.
async/await and the event loop
The async/await syntax introduced in ES2017 is not a new concurrency mechanism — it is syntactic sugar over Promises that makes asynchronous code read like synchronous code. Understanding how it maps to Promises and microtasks is essential for predicting execution order.
An async function runs synchronously until it hits the first await. At that point, it suspends execution and returns a pending Promise to its caller. When the awaited value resolves, the rest of the function is scheduled as a microtask to resume.
async function fetchData() {
console.log('B - inside fetchData, before await');
const result = await Promise.resolve('data');
console.log('D - inside fetchData, after await');
return result;
}
console.log('A - before call');
fetchData();
console.log('C - after call');
The function runs synchronously through "B", then suspends at await. Control returns to the caller, which logs "C". Then the microtask queue runs and "D" executes.
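A quick way to confirm that the caller receives a Promise, not the return value — a minimal sketch:

```javascript
async function f() {
  return 42; // the return value is wrapped in a Promise automatically
}

const p = f(); // the caller gets a Promise, not 42
console.log(p instanceof Promise); // true
p.then((value) => console.log(value)); // logs 42, as a microtask
```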
Comparing async/await with raw Promises
These two pieces of code are equivalent. The async/await version is the same underlying mechanism with cleaner syntax:
function getData() {
return fetch('/api/user')
.then(res => res.json())
.then(data => {
console.log(data.name);
return data;
})
.catch(err => {
console.error('Failed:', err);
});
}
async function getData() {
try {
const res = await fetch('/api/user');
const data = await res.json();
console.log(data.name);
return data;
} catch (err) {
console.error('Failed:', err);
}
}
Multiple awaits in sequence
Each await in an async function introduces a suspension point. The code between awaits runs synchronously, but each resumption after an await is a microtask scheduled when the previous awaited Promise resolves:
async function steps() {
console.log('step 1');
await Promise.resolve();
console.log('step 2');
await Promise.resolve();
console.log('step 3');
}
steps();
Promise.resolve().then(() => console.log('external microtask'));
"step 1" runs synchronously. At the first await, the function suspends. The external Promise.then microtask runs first because it was registered before the async function's resumption. Then "step 2" runs (scheduled by the resolved await). "step 3" runs after the second await resolves.
The rendering step
In browsers, the event loop also coordinates with the rendering pipeline. Between each macrotask (after the microtask queue is drained), the browser has an opportunity to run style calculations, layout, paint, and compositing. This rendering step only runs if the browser decides a visual update is needed — it is not guaranteed to happen after every task.
The rendering step follows this order within each event loop iteration:
1. Pick one macrotask from the task queue and run it.
2. Drain the microtask queue completely.
3. If a rendering update is needed: run requestAnimationFrame callbacks, then run style, layout, and paint.
4. Go to step 1.
This has an important implication: DOM mutations made inside a macrotask are not painted until the rendering step, which comes after all microtasks from that macrotask have run. If you mutate the DOM in a Promise callback (microtask), those changes are batched and painted together — you never see intermediate states.
// requestAnimationFrame runs before paint, after microtasks
function animate(timestamp) {
const progress = (timestamp % 2000) / 2000;
element.style.transform = `translateX(${progress * 300}px)`;
requestAnimationFrame(animate);
}
requestAnimationFrame(animate);
// This pattern is correct: rAF syncs with the display refresh rate (60fps)
// setTimeout-based animations are less accurate and can cause jank
Use requestAnimationFrame for all visual animations. It runs at the display refresh rate (typically 60fps), syncs with the rendering step, and automatically pauses when the tab is hidden to save battery. setTimeout-based animations are less precise and continue running even in background tabs.
The Node.js event loop
Node.js uses the same fundamental event loop concept but adds more structure through six distinct phases, implemented using the libuv library. Each phase has its own queue of callbacks:
- timers — executes setTimeout and setInterval callbacks whose delay has expired.
- pending callbacks — I/O callbacks deferred from the previous iteration (e.g., TCP errors).
- idle/prepare — internal Node.js use only.
- poll — retrieves new I/O events and runs their callbacks. If the poll queue is empty, waits for timers or I/O.
- check — runs setImmediate callbacks.
- close callbacks — socket close events, cleanup.
Between every phase transition, Node.js drains process.nextTick() callbacks first, then Promise microtasks. process.nextTick() is not strictly part of the event loop — it is a special queue that runs before any I/O phase begins:
console.log('1 - sync');
process.nextTick(() => console.log('3 - nextTick'));
Promise.resolve().then(() => console.log('4 - Promise microtask'));
setImmediate(() => console.log('6 - setImmediate (check phase)'));
setTimeout(() => console.log('5 - setTimeout (timers phase)'), 0);
console.log('2 - sync');
The relative order of setTimeout(fn, 0) and setImmediate in Node.js is not guaranteed when called from the main script. The output above is the most common but depends on how long the timer takes to expire relative to the event loop's startup time. Inside an I/O callback, setImmediate always runs before setTimeout.
setImmediate vs setTimeout in I/O context
const fs = require('fs');
// Inside an I/O callback, setImmediate ALWAYS runs before setTimeout
fs.readFile(__filename, () => {
setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));
});
After an I/O callback runs (poll phase), the event loop moves to the check phase next — where setImmediate lives — before circling back to the timers phase for setTimeout. Use setImmediate when you want a callback to run after the current I/O events but before any timers.
Practical production patterns
Running async operations in parallel with Promise.all
Awaiting Promises one after another in a for loop is sequential — each request waits for the previous one to complete. Starting all the operations up front (here via map) and passing the resulting Promises to Promise.all runs them concurrently and waits for all of them:
const userIds = [1, 2, 3, 4, 5];
// Sequential: total time = sum of individual request times
async function fetchSequential(ids) {
const users = [];
for (const id of ids) {
const user = await fetchUser(id); // waits for each one
users.push(user);
}
return users;
}
// Parallel: total time = slowest individual request
async function fetchParallel(ids) {
return Promise.all(ids.map(id => fetchUser(id)));
}
// If one rejection should not cancel the rest, use allSettled
async function fetchBestEffort(ids) {
const results = await Promise.allSettled(ids.map(id => fetchUser(id)));
return results
.filter(r => r.status === 'fulfilled')
.map(r => r.value);
}
Batching microtasks with queueMicrotask
// Batch multiple DOM writes into a single render cycle
let pendingUpdates = [];
function scheduleUpdate(update) {
pendingUpdates.push(update);
if (pendingUpdates.length === 1) {
// Only schedule the flush once, even if called many times
queueMicrotask(flushUpdates);
}
}
function flushUpdates() {
const batch = pendingUpdates.splice(0);
for (const update of batch) {
update();
}
}
// All three updates will be batched and run in one microtask
scheduleUpdate(() => { element1.textContent = 'updated'; });
scheduleUpdate(() => { element2.style.color = 'red'; });
scheduleUpdate(() => { element3.classList.add('active'); });
Converting callback APIs to Promises
// Promisify a classic callback-style function
function readFileAsync(path) {
return new Promise((resolve, reject) => {
fs.readFile(path, 'utf8', (err, data) => {
if (err) reject(err);
else resolve(data);
});
});
}
// Node.js 12+ has util.promisify built-in
const { promisify } = require('util');
const readFile = promisify(fs.readFile);
// Node.js 14+ fs/promises module — no promisify needed
const { readFile: readFileNative } = require('fs/promises');
Common pitfalls
Forgetting to await
async function saveUser(user) {
db.insert(user); // returns a Promise — but no await!
console.log('saved'); // runs before insert completes
}
// Fix: await the operation
async function saveUserFixed(user) {
await db.insert(user);
console.log('saved');
}
Unhandled Promise rejections
// This rejection is silently swallowed in older environments
async function danger() {
throw new Error('oops');
}
danger(); // Promise returned but not awaited — rejection ignored
// Fix: always handle rejections
danger().catch(err => console.error(err));
// Or: use a global handler as a safety net
process.on('unhandledRejection', (reason) => {
console.error('Unhandled rejection:', reason);
process.exit(1);
});
Infinite microtask loops
// This freezes the page — microtask queue never empties
function infinite() {
Promise.resolve().then(infinite);
}
infinite();
// Fix: use setTimeout to yield to the macrotask queue
function yieldingLoop() {
doSomeWork();
setTimeout(yieldingLoop, 0); // hands control back each iteration
}
// Still an endless loop, but the page stays responsive between iterations
Using await inside loops incorrectly
const urls = ['/a', '/b', '/c'];
// Wrong in forEach — forEach does not await the callback
urls.forEach(async (url) => {
const data = await fetch(url);
console.log(data);
});
// All three fetches start, but there is no way to know when they all finish
// Correct option 1: for...of (sequential; inside an async function or a module with top-level await)
for (const url of urls) {
const data = await fetch(url);
console.log(data);
}
// Correct option 2: Promise.all (parallel)
await Promise.all(urls.map(async (url) => {
const data = await fetch(url);
console.log(data);
}));
The rule is simple: forEach, map, filter, and reduce are synchronous and do not await async callbacks. Use for...of for sequential async iteration, or wrap in Promise.all for parallel execution where you need to know when all operations are complete.
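If you need sequential execution and the collected results, the for...of pattern generalizes into a small helper — a sketch; mapSeries is not a built-in:

```javascript
// Sequential async map: runs fn on one item at a time, in order,
// and returns the collected results.
async function mapSeries(items, fn) {
  const results = [];
  for (const item of items) {
    results.push(await fn(item)); // each call waits for the previous one
  }
  return results;
}
```

Unlike Promise.all(items.map(fn)), this starts each operation only after the previous one settles, which is what you want when order matters or when the target API enforces rate limits.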