JavaScript and the Runtime
For a long time I thought "JavaScript" and "the JavaScript runtime" were the same thing. I'd write setTimeout, fetch, fs.readFileSync, and console.log without thinking about where those functions actually came from. They weren't part of the language. They were part of the environment the language happened to be running in. Understanding that distinction changed how I debug, how I architect, and how I reason about async code.
Here's what I've pieced together over the years.
Engine vs Runtime
The engine is the piece that parses and executes JavaScript. V8 (Chrome, Node.js, Deno), SpiderMonkey (Firefox), and JavaScriptCore (Safari, Bun) are engines. They implement the ECMAScript spec: variables, functions, closures, prototypes, promises, generators—the language itself.
The runtime is everything else. It's the engine plus the APIs, the event loop, the task queues, and the I/O layer that the host environment provides. Node.js is a runtime. The browser is a runtime. Deno and Bun are runtimes. They all embed a JavaScript engine, but they each provide different APIs on top.
This is why the same language behaves differently depending on where you run it:
```js
// This works in Node.js
import { readFileSync } from 'node:fs'

const config = readFileSync('./config.json', 'utf-8')
console.log(JSON.parse(config))
```

```js
// This works in the browser
const el = document.querySelector('#app')
el.innerHTML = '<h1>Hello</h1>'
```

Neither `fs` nor `document` is part of ECMAScript. They're APIs provided by their respective runtimes. The engine doesn't know about files or the DOM. It just executes the code and calls whatever bindings the runtime has registered.
This matters in practice. When you see ReferenceError: document is not defined in a server-side rendering context, the bug isn't in your code's logic—it's that you're calling a browser API in a runtime that doesn't have it. Once you internalize the engine/runtime split, these errors stop being mysterious.
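One practical consequence: code that has to run in both environments should feature-detect the runtime API instead of assuming it exists. A minimal sketch (`renderGreeting` is a made-up helper, not a standard API):

```javascript
// Hypothetical helper that renders in the browser but degrades on the server.
function renderGreeting(name) {
  const html = `<h1>Hello, ${name}</h1>`
  if (typeof document !== 'undefined') {
    // Browser runtime: the DOM exists, so mount the markup.
    document.querySelector('#app').innerHTML = html
  }
  // Server runtime: no DOM, so just return the markup to the caller.
  return html
}

console.log(renderGreeting('Ada'))
```

The `typeof` check is the key move: it asks the runtime what it provides instead of crashing on a missing global.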
The Call Stack
JavaScript is single-threaded. One call stack, one piece of code executing at a time. When you call a function, it gets pushed onto the stack. When it returns, it gets popped off. If a function calls another function, that goes on top, and so on.
```js
function multiply(a, b) {
  return a * b
}

function square(n) {
  return multiply(n, n)
}

function printSquare(n) {
  const result = square(n)
  console.log(result)
}

printSquare(5)
```

The call stack at the deepest point looks like this:

```
multiply(5, 5)
square(5)
printSquare(5)
main()
```
Each function waits for the one above it to return before it can continue. This is synchronous execution. It's simple and predictable—until it isn't.
If the stack never unwinds, you get a stack overflow:
```js
function recurse() {
  recurse()
}

recurse()
// RangeError: Maximum call stack size exceeded
```

And if a function on the stack takes a long time, everything else waits. In the browser, that means the UI freezes—no clicks, no scrolling, no animations. In Node.js, it means no other requests get handled.
```js
function blockFor(ms) {
  const start = Date.now()
  while (Date.now() - start < ms) {
    // spinning
  }
}

console.log('before')
blockFor(3000) // nothing else can run for 3 seconds
console.log('after')
```

This is the fundamental constraint that the event loop exists to work around.
Runtime APIs: The Async Escape Hatch
The engine itself has no concept of timers, network requests, or file I/O. Those come from the runtime. When you call setTimeout, you're not telling the engine to wait—you're telling the runtime to set a timer and call you back when it fires.
Here's the mental model: the runtime provides APIs that accept a callback and do work outside the call stack. When the work is done, the callback gets placed in a queue. The event loop picks it up when the call stack is empty.
```js
console.log('1')
setTimeout(() => {
  console.log('2')
}, 0)
console.log('3')
```

Output:

```
1
3
2
```
Even with a delay of 0, the setTimeout callback doesn't run immediately. It gets handed to the runtime's timer system, which places the callback in the macrotask queue. The event loop won't touch that queue until the current call stack is empty. So console.log('3') runs first.
This is the core insight: JavaScript doesn't do async by itself. The runtime does. The engine runs synchronous code. The runtime provides the mechanisms to schedule work for later. The event loop coordinates between them.
The Event Loop
The event loop is the scheduler. It's a continuous cycle that checks: is the call stack empty? If yes, are there microtasks? Run all of them. Then, is there a macrotask? Run one. Repeat.
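That cycle can be sketched as a toy model in JavaScript itself. This is not how any runtime is actually implemented (the real loop lives in the host, in libuv or the browser), just the scheduling logic in miniature:

```javascript
// Toy event loop: two queues and the drain rule.
const microtasks = []
const macrotasks = []

function drainMicrotasks() {
  // Runs until empty, so microtasks queued by microtasks also run now.
  while (microtasks.length > 0) {
    microtasks.shift()()
  }
}

function tick() {
  drainMicrotasks()
  if (macrotasks.length > 0) {
    macrotasks.shift()() // exactly one macrotask per iteration
    drainMicrotasks() // its microtasks run before the next macrotask
  }
}

// Demo: a macrotask that queues a microtask.
macrotasks.push(() => {
  console.log('macrotask')
  microtasks.push(() => console.log('microtask queued by it'))
})
tick() // prints: macrotask, then: microtask queued by it
```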
Here's the sequence that made it click for me:
```js
console.log('1: script start')

setTimeout(() => {
  console.log('2: setTimeout')
}, 0)

Promise.resolve().then(() => {
  console.log('3: promise')
})

queueMicrotask(() => {
  console.log('4: microtask')
})

console.log('5: script end')
```

Output:

```
1: script start
5: script end
3: promise
4: microtask
2: setTimeout
```
Walk through it:

1. `console.log('1: script start')` — runs immediately; it's synchronous.
2. `setTimeout(...)` — the callback is registered with the runtime's timer. It goes into the macrotask queue.
3. `Promise.resolve().then(...)` — the promise is already resolved, so the `.then` callback goes into the microtask queue.
4. `queueMicrotask(...)` — directly enqueues a microtask.
5. `console.log('5: script end')` — runs immediately.
6. The call stack is now empty. The event loop drains the microtask queue: `'3: promise'`, then `'4: microtask'`.
7. The microtask queue is empty. The event loop picks up the next macrotask: `'2: setTimeout'`.
The rule is simple: microtasks always run before the next macrotask. The entire microtask queue drains before the event loop moves on.
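You can watch the drain happen between two back-to-back macrotasks. A microtask queued inside the first timer callback runs before the second timer callback, even though both timers were scheduled together:

```javascript
const order = []

setTimeout(() => {
  order.push('macrotask 1')
  // Queued mid-macrotask; it drains before the next macrotask starts.
  Promise.resolve().then(() => order.push('microtask'))
}, 0)

setTimeout(() => {
  order.push('macrotask 2')
  console.log(order.join(' -> '))
  // macrotask 1 -> microtask -> macrotask 2
}, 0)
```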
Microtasks vs Macrotasks
This distinction tripped me up for years. Here's the breakdown:
Microtasks:

- `Promise.then`/`catch`/`finally` callbacks
- `queueMicrotask()`
- `MutationObserver` (browser)
- `process.nextTick()` (Node.js — technically its own queue, but similar priority)

Macrotasks:

- `setTimeout`/`setInterval`
- `setImmediate` (Node.js)
- I/O callbacks (file reads, network responses)
- UI rendering events (browser)
The key behavior: after every macrotask, the engine drains the entire microtask queue before running the next macrotask. This means microtasks can starve macrotasks:
```js
function floodMicrotasks() {
  let count = 0
  function enqueue() {
    if (count < 100_000) {
      count++
      queueMicrotask(enqueue)
    }
  }
  enqueue()

  setTimeout(() => {
    console.log(`setTimeout ran after ${count} microtasks`)
  }, 0)
}

floodMicrotasks()
// setTimeout ran after 100000 microtasks
```

The `setTimeout` callback sits in the macrotask queue the entire time, waiting for the microtask queue to drain. If the microtask queue never empties—say, each microtask enqueues another one indefinitely—the macrotask never runs. In the browser, that means the page freezes. No rendering, no input handling, nothing.
This is why you should never create unbounded microtask loops. If you need to do a lot of work in chunks, use setTimeout to yield back to the event loop between batches so other work (rendering, I/O) can proceed.
Async/Await Under the Hood
async/await is syntactic sugar over promises. An async function returns a promise. await pauses execution of that function and schedules the continuation as a microtask when the awaited value resolves. The function doesn't block the call stack—it yields.
These two are equivalent in terms of scheduling:
```js
// Promise chain
function fetchUser(id) {
  return fetch(`/api/users/${id}`)
    .then((res) => res.json())
    .then((user) => {
      console.log(user.name)
      return user
    })
}
```

```js
// Async/await
async function fetchUser(id) {
  const res = await fetch(`/api/users/${id}`)
  const user = await res.json()
  console.log(user.name)
  return user
}
```

Both produce the same microtask behavior. Each `.then` or `await` boundary is a point where the function yields and the continuation enters the microtask queue.
Here's a more revealing example:
```js
async function foo() {
  console.log('foo start')
  await Promise.resolve()
  console.log('foo after await')
}

console.log('script start')
foo()
console.log('script end')
```

Output:

```
script start
foo start
script end
foo after await
```
foo() runs synchronously up to the first await. The await Promise.resolve() suspends foo and schedules the rest as a microtask. Control returns to the caller, console.log('script end') runs, and then the microtask queue drains, running 'foo after await'.
Understanding this is critical for debugging. When you see code after an await running "later than expected," it's because await is a yield point. Everything after it is a microtask, not synchronous continuation.
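A contrived flag makes the trap concrete: the assignment after `await` looks adjacent to the call, but it hasn't happened yet when the next synchronous line runs.

```javascript
let ready = false

async function init() {
  await Promise.resolve()
  ready = true // scheduled as a microtask, not run synchronously
}

init()
console.log(ready) // false: the continuation after await hasn't run yet
```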
Node.js Event Loop Specifics
The browser's event loop is relatively simple: microtasks, then one macrotask, then render, repeat. Node.js is more complex. Its event loop (powered by libuv) has distinct phases:
1. Timers — executes `setTimeout` and `setInterval` callbacks whose threshold has elapsed.
2. Pending callbacks — executes I/O callbacks deferred from the previous cycle (e.g., TCP errors).
3. Idle/prepare — internal use only.
4. Poll — retrieves new I/O events and executes I/O callbacks (file reads, network data). This is where Node spends most of its time when idle.
5. Check — executes `setImmediate` callbacks.
6. Close callbacks — executes close event callbacks (e.g., `socket.on('close', ...)`).
Between every phase (and, since Node 11, after each individual callback), Node drains the microtask queue (promises, `queueMicrotask`). And before microtasks, it drains the `process.nextTick` queue.
process.nextTick vs queueMicrotask
process.nextTick is Node-specific and runs before promise microtasks:
```js
Promise.resolve().then(() => console.log('promise'))
queueMicrotask(() => console.log('microtask'))
process.nextTick(() => console.log('nextTick'))
console.log('sync')
```

Output in Node.js:

```
sync
nextTick
promise
microtask
```
process.nextTick has its own queue that drains before the microtask queue. This makes it the highest-priority async callback in Node. It's useful for ensuring a callback runs after the current operation completes but before any I/O or timers, but it carries the same starvation risk as microtasks—an unbounded nextTick loop will starve everything.
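Here's that starvation in bounded form, capped so the demo terminates; remove the cap and the timer never fires:

```javascript
let ticks = 0

function spin() {
  // Each callback schedules another; the nextTick queue never empties
  // until the cap is hit, so the timer below has to wait.
  if (ticks < 1000) {
    ticks++
    process.nextTick(spin)
  }
}

spin()
setTimeout(() => console.log(`timer fired after ${ticks} nextTick callbacks`), 0)
// timer fired after 1000 nextTick callbacks
```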
The Node.js docs themselves recommend queueMicrotask over process.nextTick for most use cases, because queueMicrotask is standard across runtimes and doesn't have the priority inversion footgun.
setImmediate vs setTimeout(fn, 0)
```js
setTimeout(() => console.log('timeout'), 0)
setImmediate(() => console.log('immediate'))
```

The order of these two is non-deterministic when called from the main module. It depends on how fast the process bootstraps and whether the timer threshold has elapsed by the time the event loop reaches the timers phase.
But inside an I/O callback, setImmediate always fires first:
```js
import { readFile } from 'node:fs'
import { fileURLToPath } from 'node:url'

// __filename isn't defined in ES modules; derive it from import.meta.url.
const __filename = fileURLToPath(import.meta.url)

readFile(__filename, () => {
  setTimeout(() => console.log('timeout'), 0)
  setImmediate(() => console.log('immediate'))
})
```

Output (always):

```
immediate
timeout
```
After the I/O callback runs (poll phase), the event loop moves to the check phase (setImmediate) before looping back to the timers phase (setTimeout). This is deterministic because the starting phase is known.
The Runtime Landscape Today
The JavaScript runtime ecosystem has expanded significantly. Here's how the major players compare:
Node.js — The original server-side runtime. V8 engine, libuv event loop, massive npm ecosystem. CommonJS and ES modules. The default choice for most backend JavaScript.
Deno — Created by Ryan Dahl (who also created Node.js) to fix what he saw as Node's design mistakes. V8 engine, TypeScript support out of the box, permissions-based security model, web-standard APIs (fetch, Request, Response are built in). No node_modules by default, though it now supports npm packages.
Bun — Built on JavaScriptCore (Safari's engine) instead of V8. Focuses on speed: faster startup, faster installs, faster test runner. Aims to be a drop-in replacement for Node.js with built-in bundler, test runner, and package manager.
Browser runtimes — Each browser has its own engine (V8 in Chrome, SpiderMonkey in Firefox, JavaScriptCore in Safari) and provides the DOM, Web APIs, and a rendering pipeline. The event loop includes a rendering step that server runtimes don't have.
What they all share: single-threaded JavaScript execution, an event loop, microtask/macrotask scheduling, and the fundamental model of non-blocking I/O through callbacks and promises. The APIs differ, but the mental model transfers.
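That transfer is what makes portable code feasible, provided you check which runtime you're in before touching its APIs. A rough detection sketch, using the globals each runtime is known to expose (treat the exact checks and their order as assumptions, not a spec):

```javascript
function detectRuntime() {
  if (typeof Deno !== 'undefined') return 'deno'
  if (typeof Bun !== 'undefined') return 'bun'
  if (typeof process !== 'undefined' && process.versions?.node) return 'node'
  if (typeof window !== 'undefined' && typeof document !== 'undefined') return 'browser'
  return 'unknown'
}

console.log(detectRuntime())
```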
Practical Takeaways
After spending time understanding the runtime, here's what actually changed in how I write code:
Don't block the event loop. Any synchronous work that takes more than a few milliseconds is a problem. In the browser, it freezes the UI. In Node.js, it blocks every other request. If you have CPU-intensive work, offload it to a Worker thread (worker_threads in Node, Web Workers in the browser) or break it into chunks with setTimeout.
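For work that can't be chunked, a Worker moves it off the main thread entirely. A minimal `worker_threads` sketch, where the same file spawns itself as the worker (the summing loop is a stand-in for real CPU-bound work):

```javascript
import { Worker, isMainThread, parentPort, workerData } from 'node:worker_threads'
import { fileURLToPath } from 'node:url'

if (isMainThread) {
  // Main thread: spawn the worker and keep the event loop free.
  const worker = new Worker(fileURLToPath(import.meta.url), { workerData: 10_000_000 })
  worker.on('message', (sum) => console.log('sum from worker:', sum))
} else {
  // Worker thread: the blocking loop runs here without freezing the main thread.
  let sum = 0
  for (let i = 0; i < workerData; i++) sum += i
  parentPort.postMessage(sum)
}
```

When the work can be chunked instead, yielding with setTimeout keeps everything on one thread, as in the helper below.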
```js
function processInChunks(items, chunkSize, processFn) {
  let index = 0
  function next() {
    const end = Math.min(index + chunkSize, items.length)
    for (let i = index; i < end; i++) {
      processFn(items[i])
    }
    index = end
    if (index < items.length) {
      setTimeout(next, 0)
    }
  }
  next()
}
```

Know what's a microtask and what's a macrotask. When you're debugging timing issues—a callback fires before data is ready, a UI update doesn't appear when expected—the answer is almost always in the task queue ordering. Promises resolve as microtasks. Timers fire as macrotasks. Microtasks always go first.
Understand that await is a yield point. Code after await doesn't run synchronously. It's scheduled as a microtask. If you need something to happen synchronously, don't put it after an await.
Know which APIs belong to the runtime, not the language. setTimeout, fetch, console, fs, document—none of these are part of ECMAScript. They're runtime-provided. When you're writing code that needs to run in multiple environments (SSR, edge functions, service workers), this distinction determines what you can and can't use.
Use queueMicrotask over process.nextTick. It's cross-runtime, it's standard, and it doesn't have the priority inversion issues. Unless you specifically need to run before promise callbacks in Node.js, queueMicrotask is the better default.
The JavaScript runtime is a collaboration between the engine and the environment. The engine runs your code. The environment gives it something useful to do. The event loop makes sure they take turns. Once that model is clear, async JavaScript stops feeling like magic and starts feeling like a system you can reason about.