What await actually desugars to
Every senior FE dev has written async/await. Almost nobody can show you the generator and Promise chain underneath. Here's the desugaring straight from ECMA-262, ending with a ~25-line implementation you can step through in a REPL.
async and await feel like magic. They're not. They're one of the most thoroughly documented rewrites in ECMA-262 — a syntactic skin over two things you already know: generators and Promises.
The desugaring isn't hidden either. It's spelled out in the "Await" abstract operation. Most blog posts trying to explain it stop at "it's kind of like a Promise.then chain". This post does the actual walk-through. By the end you'll have implemented await yourself in about 25 lines of plain JavaScript — no transpiler, running today.
If you've read the previous post on the microtask checkpoint, you already know where an awaited continuation runs: in the microtask queue, drained by the checkpoint. This post is about what that continuation actually is.
await x compiles to three conceptual steps: (1) wrap x in Promise.resolve to get a real promise, (2) attach a .then that will resume the surrounding async function, (3) suspend the async function. "Resuming" is just calling .next(value) on a hidden generator that holds the function's state. The whole async/await feature is a generator with Promise plumbing autogenerated by the engine.
The big picture
Before we open the spec, here's the skeleton of what's happening. When the parser sees:
```js
async function f() {
  const a = await x;
  return a + 1;
}
```

It produces (conceptually) a generator-like function plus a driver:
```js
function f() {
  const gen = (function* () {
    const a = yield x;
    return a + 1;
  })();
  return runToCompletion(gen);
}
```

`runToCompletion` is a small loop. It pulls the next yielded value from the generator, wraps it in `Promise.resolve`, attaches `.then` callbacks — one for fulfilment (calls `gen.next(value)`), one for rejection (calls `gen.throw(err)`) — and returns a Promise that resolves to the generator's final return value.
That's the whole feature. The engine does the rewriting for us. It also has a few optimisations we'll get to, but the shape is this.
Now let's see the spec prove it.
The ECMA-262 algorithm
Here's the Await abstract operation in current ECMA-262 — the evaluation semantics of an AwaitExpression. The real algorithm is 12 steps; the version below collapses the fulfilledClosure inner body and the symmetric rejected path so you can see the shape:
- Step 1 — Let asyncContext be the running execution context.
- Step 2 — Let promise be ? `PromiseResolve(%Promise%, value)`.
- Step 3 — Let fulfilledClosure be a new Abstract Closure with parameters (v) that captures asyncContext and, when called, resumes asyncContext with NormalCompletion(v). (inner body elided)
- Step 4 — Let onFulfilled be `CreateBuiltinFunction(fulfilledClosure, 1, "", « »)`.
- Steps 5–6 — (rejectedClosure + onRejected, symmetric to 3–4, omitted for space)
- Step 7 — Perform `PerformPromiseThen(promise, onFulfilled, onRejected)`.
- Step 8 — Remove asyncContext from the execution context stack; restore the previous running execution context.
- Steps 9–11 — Resume the caller. If asyncContext is ever resumed again, let completion be the Completion Record with which it was resumed.
- Step 12 — Return completion.
Unpack that:
- Step 2 — `PromiseResolve(%Promise%, value)`. Whatever you pass to `await` becomes a Promise. `await 5` wraps the `5` in `Promise.resolve(5)`. `await somePromise` doesn't double-wrap — that's the "thenable assimilation" rule. Either way, what we end up with is always a real Promise.
- Steps 3–6 — the engine builds two closures. One resumes with a value; the other resumes with an error. These become the `onFulfilled`/`onRejected` handlers attached via `.then`.
- Step 7 — `PerformPromiseThen(promise, onFulfilled, onRejected)`. This is exactly what `Promise.prototype.then` does under the hood. The continuation is now pending in the microtask queue, waiting for `promise` to settle.
- Steps 8–12 — the async function gives up its frame on the call stack. The caller resumes synchronously. When the `onFulfilled`/`onRejected` closures fire later, they resume asyncContext with a Completion Record. Steps 10/12 then return that Completion as the result of the `await`.
Read step 3 one more time. The closure captures asyncContext and later resumes it with a normal (or throw) completion. That's what makes async functions look like they "pause". They don't pause. They suspend — state preserved, plus a pre-built callback that knows how to resume them. Exactly like a generator paused at a yield.
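That analogy is directly checkable in a console. Here's a minimal sketch of a generator suspended at a `yield` and then resumed from outside with a value, which is exactly what the engine's fulfilledClosure does to a suspended async function:

```js
function* g() {
  // Pauses here; the value passed to the *next* .next() call
  // becomes the result of this yield expression.
  const v = yield "suspended";
  return v + 1;
}

const it = g();
console.log(it.next());   // { value: "suspended", done: false }, paused at yield
console.log(it.next(41)); // { value: 42, done: true }, resumed with 41
```

The second `.next(41)` is the hand-written equivalent of the spec's "resume asyncContext with NormalCompletion(v)".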
Where that resumption runs
Step 7 performs PerformPromiseThen, which eventually queues a PromiseReactionJob — an ECMAScript Job in the strict language-spec sense.
Jobs enter the host's microtask queue through HostEnqueuePromiseJob. That's a host hook defined by ECMA-262 and implemented by whichever runtime embeds V8. In browsers, HTML defines the hook as "enqueue onto the surrounding agent's event loop's microtask queue" (HTML — HostEnqueuePromiseJob).
(Note: this is specifically how Promise.then / await reach the microtask queue. The HTML-level queueMicrotask() API takes a different path — it enqueues a microtask directly, without going through the ECMAScript Jobs machinery.)
So the continuation after await runs in the same microtask drain as Promise.then callbacks. That's the connection to the previous post — the event-loop checkpoint is where everything await-related actually executes.
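You can watch that shared drain directly. In this sketch (assuming a modern engine, Node 12+ or any current browser), a `queueMicrotask` callback, a `.then` reaction, and an `await` continuation are queued in source order and drain FIFO from the same queue:

```js
const order = [];

queueMicrotask(() => order.push("queueMicrotask"));
Promise.resolve().then(() => order.push("then"));
(async () => {
  await null; // PromiseResolve(null): already-fulfilled promise, one reaction job
  order.push("await");
})();

// All three are microtasks; the drain is FIFO.
setTimeout(() => console.log(order), 0); // ["queueMicrotask", "then", "await"]
```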
In the animation from that post, every `await …` in my Locus dashboards is one of those blue `await Ingka.acquire(Locus)` tokens: it enters Web APIs (waiting on the real I/O), and when the underlying promise settles, it drops into the microtask queue, gets drained, and pushes the async function back onto the call stack to continue.
What V8 actually emits
Since V8 v7.2 (Chrome 72, late 2018), V8 ships an optimised implementation of async functions. It skips the extra microtask hops a naïve desugaring would introduce. The V8 team's writeup is here: Faster async functions and promises.
The fast path kicks in when await x sees that x is already a native Promise. V8 skips the extra promise that PerformPromiseThen would otherwise allocate as the result of the implicit .then, and resumes the async function directly once x settles. The V8 blog quantifies the win as going from 3 microtask ticks down to 1.
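One observable consequence, sketched below under the assumption of a modern engine (this is spec behaviour since the 2018 normative change that V8 7.2 shipped): awaiting a native Promise resumes after a single microtask tick, while awaiting a plain thenable needs extra ticks for assimilation. Started second, the native await finishes first:

```js
const log = [];

// Awaiting a plain thenable: assimilation costs extra microtask ticks.
(async () => {
  await { then(resolve) { resolve("thenable"); } };
  log.push("thenable done");
})();

// Awaiting a native Promise: resumes on the very next tick.
(async () => {
  await Promise.resolve("native");
  log.push("native done");
})();

Promise.resolve()
  .then(() => log.push("tick 1"))
  .then(() => log.push("tick 2"));

setTimeout(() => console.log(log), 0);
// ["native done", "tick 1", "thenable done", "tick 2"]
```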
One of the Torque/CSA builtins that implements the resumption side:
```cpp
TF_BUILTIN(AsyncFunctionAwaitResolveClosure, AsyncFunctionBuiltinsAssembler) {
  CSA_DCHECK_JS_ARGC_EQ(this, 1);
  const auto sentValue = Parameter<Object>(Descriptor::kSentValue);
  const auto context = Parameter<Context>(Descriptor::kContext);
  AsyncFunctionAwaitResumeClosure(context, sentValue, JSGeneratorObject::kNext);
  Return(UndefinedConstant());
}
```

The closure fires once the awaited Promise fulfils. `AsyncFunctionAwaitResumeClosure` (a helper elsewhere in the same file) eventually calls `CallBuiltin(Builtin::kResumeGeneratorTrampoline, …)` to restore the async function's execution context from where it suspended. The spec's "suspend asyncContext / resume asyncContext" is implemented by turning the async function into a `JSGeneratorObject` at the bytecode level. So the "literal desugaring" really does have a generator inside it. V8 just synthesizes it for you instead of you authoring it.
(The rejection path lives in AsyncFunctionAwaitRejectClosure, which resumes with JSGeneratorObject::kThrow instead of kNext — surfacing the failure as a throw inside the async function body. Go poke around the file; it's only ~300 lines.)
Implementing await yourself
Here's the payoff. Given what we know, we can write runToCompletion by hand. This is the actual semantics of async functions:
```js
function runAsync(generatorFn) {
  return function (...args) {
    const gen = generatorFn.apply(this, args);
    return new Promise((resolve, reject) => {
      function step(method, arg) {
        let result;
        try {
          result = gen[method](arg);
        } catch (err) {
          return reject(err);
        }
        if (result.done) {
          return resolve(result.value);
        }
        // Mirrors Await step 2 — PromiseResolve.
        Promise.resolve(result.value).then(
          (v) => step("next", v), // onFulfilled (steps 3–4)
          (e) => step("throw", e) // onRejected (steps 5–6)
        );
      }
      step("next", undefined);
    });
  };
}
```

That's about 25 lines doing the work of `async`. Now let's use it:
```js
const f = runAsync(function* () {
  const a = yield Promise.resolve(2);
  const b = yield Promise.resolve(3);
  return a + b;
});

f().then((result) => console.log(result)); // 5
```

Run that in a Node REPL or browser console. It works.
The pattern `runAsync(function* () { … yield … })` is a faithful model of `async function () { … await … }`. Replace `yield` with `await`, wrap the function with `async`, and you get the same observable semantics. V8's actual implementation is slightly fancier — since V8 7.2 it skips extra promise allocations on the fast path (see the V8 section above). The shape here is still the right mental model.
This pattern has a name: it's "coroutines with a trampoline". Node.js's co library (2014) did exactly this — it was the community's async/await before async/await. Read its source if you want to see the full version including thenable assimilation, array-of-promises, and multi-mode yields.
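One arm of the driver that hasn't been exercised yet is `gen.throw`, the path that surfaces a rejected awaited promise as a catchable exception inside the function body. A self-contained sketch (restating a minimal two-arm driver so it runs standalone):

```js
// Minimal driver: just the next/throw arms from runToCompletion.
function run(genFn) {
  const gen = genFn();
  return new Promise((resolve, reject) => {
    function step(method, arg) {
      let r;
      try {
        r = gen[method](arg);
      } catch (e) {
        return reject(e); // uncaught inside the body: rejected promise
      }
      if (r.done) return resolve(r.value);
      Promise.resolve(r.value).then(
        (v) => step("next", v),
        (e) => step("throw", e) // rejection re-enters the generator as a throw
      );
    }
    step("next", undefined);
  });
}

run(function* () {
  try {
    yield Promise.reject(new Error("boom")); // like: await Promise.reject(...)
  } catch (e) {
    return "recovered: " + e.message; // try/catch works exactly as in async/await
  }
}).then((v) => console.log(v)); // "recovered: boom"
```

If the generator doesn't catch, `gen.throw` rethrows out of `step`, the `try/catch` rejects the outer promise, and you get the familiar rejected-async-function behaviour.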
Three things the desugaring tells you
1. await on a non-Promise still yields to microtasks
Because of step 2 (PromiseResolve), even awaiting a primitive suspends:
```js
async function f() {
  const x = await 5; // 5 gets Promise.resolve-wrapped
  console.log("async:", x);
}
f();
console.log("sync");
```

You suspended. A microtask was queued. The main line ran first. Only then did the continuation fire.
2. return in an async function is Promise.resolve
Because runToCompletion does resolve(result.value) on done, returning a value from an async function wraps it in a Promise:
```js
async function f() {
  return 42;
}

console.log("typeof f() →", typeof f());
console.log("f() instanceof Promise →", f() instanceof Promise);
console.log("await f() →", await f());
```

And symmetrically: `throw` in an async function becomes Promise rejection.
3. Sequential awaits are actually sequential
```js
async function slow() {
  const a = await fetch("/a");
  const b = await fetch("/b"); // doesn't start until /a fulfils
  return [a, b];
}
```

Each `yield`/`await` in the generator only proceeds after the previous Promise settled. That's why `Promise.all([fetchA(), fetchB()])` is genuinely faster — the fetches are already in-flight before any `await` suspends anything.
Here's the cost, in milliseconds. Same work, different scheduling:
```js
const delay = (ms, label) => new Promise(r => setTimeout(() => r(label), ms));

// Sequential: total ≈ 150ms (50 + 50 + 50)
const t1 = performance.now();
const a = await delay(50, "a");
const b = await delay(50, "b");
const c = await delay(50, "c");
console.log("sequential:", Math.round(performance.now() - t1), "ms →", [a, b, c]);

// Parallel: total ≈ 50ms (max of the three)
const t2 = performance.now();
const [x, y, z] = await Promise.all([
  delay(50, "x"),
  delay(50, "y"),
  delay(50, "z"),
]);
console.log("parallel: ", Math.round(performance.now() - t2), "ms →", [x, y, z]);
```

Classic mistake. Easier to spot once you see it as the generator you now know it is.
The edge case nobody covers: re-entrance
One subtle thing the spec handles and every hand-rolled runAsync forgets: what happens if an await's Promise was already settled?
Walk through it step by step — the A, B, C order is the visual proof:

```js
async function f() {
  console.log("A");
  await Promise.resolve(); // this Promise is ALREADY fulfilled
  console.log("C");
}
f();
console.log("B");
```

You might expect "A, C, B" because the Promise is already fulfilled. But the output is A, B, C: the continuation still runs after the current synchronous block — because `PerformPromiseThen` always enqueues a reaction job, even for already-settled promises. There's no "fast path" that skips the queue. The `PerformPromiseThen` call in Await's step 7 is unconditional.
This matters for libraries: you can't write code that "depends on" a cached Promise being synchronous. It never is. Await always yields, period. That's one of the most valuable invariants in JavaScript.
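Here's what that invariant looks like in the cached-Promise case. A sketch with a hypothetical config cache (`cache` and `get` are made up for illustration):

```js
const cache = new Map([["config", Promise.resolve({ theme: "dark" })]]);

async function get(key) {
  // Even on a cache hit this function suspends at least once:
  // the returned promise can never be observed as settled synchronously.
  return cache.get(key);
}

let sawValue = false;
get("config").then(() => { sawValue = true; });
console.log("settled synchronously?", sawValue); // false, always
```

Code that needs a synchronous fast path for cached values has to branch *before* entering the async function, not inside it.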
(In post-V8-7.2 engines the fast path for await x where x is already a native Promise eliminates intermediate promise allocations and, per the V8 blog post, collapses the pre-resumption microtask hop count from 3 ticks down to 1. The yield to a microtask still happens — one tick is the minimum — but the two extra hops the older spec mandated are gone. Treat that as engine-level detail; at the language level, await always yields to a microtask is still the invariant to program against.)
Where this shows up in production
At Locus, we have a config-driven UI that loads schemas asynchronously and renders forms. A naive version:
```jsx
async function renderForm(client) {
  const schema = await loadSchema(client); // network
  const theme = await loadTheme(client); // network
  const perms = await loadPermissions(client); // network
  return <Form schema={schema} theme={theme} perms={perms} />;
}
```

Each `await` is suspend → microtask → resume. Serial. For a dashboard rendering 50 forms, each form pays three sequential network round-trips instead of one.
The fix is to kick off the promises before suspending:
```jsx
async function renderForm(client) {
  const [schema, theme, perms] = await Promise.all([
    loadSchema(client),
    loadTheme(client),
    loadPermissions(client),
  ]);
  return <Form schema={schema} theme={theme} perms={perms} />;
}
```

Now: one await, one suspend, all three requests in flight. The engine still generates the generator; now it only yields once.
This is an n-fold latency win (n being the number of serial awaits) that you can only see once you've internalized that each await is literally a yield waiting on a Promise.
Bonus — the full trace, with real pending work
The earlier visualizer showed await Promise.resolve() — an already-settled promise. No Web APIs involved. Continuation goes straight to the microtask queue because there's nothing to wait on.
But most of the awaits you actually write are on pending promises — a fetch(), a setTimeout-backed wait, an I/O operation. Those look different in the runtime:
- The pending promise parks its async work inside a Web API (network stack, timer subsystem, file I/O).
- Control returns to the event loop. The main thread keeps working on other tasks.
- When the Web API's work completes, it fires a task (a macrotask — because it's an I/O / timer callback) that eventually resolves the promise.
- Then the continuation past `await` gets queued as a microtask.
The path: stack → Web APIs → macrotask queue → microtask queue → stack. Four hops, not two. Watch the token journey:
```js
async function load() {
  console.log("A");
  await new Promise(r => setTimeout(r, 50));
  console.log("D");
}
load();
console.log("B");
```

And the live version — the "~50ms" gap between B and D is the timer in flight inside Web APIs:
```js
async function load() {
  const t0 = performance.now();
  console.log("A — sync inside load");

  await new Promise(r => setTimeout(r, 50));

  console.log("D — resumed after", (performance.now() - t0).toFixed(0), "ms");
}

load();
console.log("B — sync from script");
```

The one-line difference between the two cases:
- `await <already-settled>` → no Web API trip. `PerformPromiseThen` sees a fulfilled promise, queues the reaction as a microtask immediately.
- `await <pending>` → the pending work lives in Web APIs. When it settles (via a task), that task's callback resolves the promise, which then queues the reaction as a microtask.
Same mechanism on the microtask side. The difference is whether there's real work to wait on.
Next up
Next in The Runtime series: "setTimeout(fn, 0) is never 0ms." We'll trace the clamping rules in HTML's Timers section, look at V8's timer backend, and explain why backgrounded tabs throttle to 1000ms.
Primary sources: