This blog post explains the ECMAScript proposal “Asynchronous Iteration” by Domenic Denicola and Kevin Smith. It reached stage 4 on 2018-01-25 and is part of ECMAScript 2018.
With ECMAScript 6, JavaScript got built-in support for synchronously iterating over data. But what about data that is delivered asynchronously? For example, lines of text, read asynchronously from a file or an HTTP connection.
This proposal brings support for that kind of data. Before we go into it, let’s first recap synchronous iteration.
Synchronous iteration was introduced with ES6 and works as follows:
- Iterable: an object that signals that it can be iterated over, via a method whose key is Symbol.iterator.
- Iterator: an object returned by invoking [Symbol.iterator]() on an iterable. It wraps each iterated element in an object and returns it via its method next() – one at a time.
- IteratorResult: an object returned by next(). Property value contains an iterated element, property done is true after the last element (value can usually be ignored then; it’s almost always undefined).

I’ll demonstrate via an Array:
> const iterable = ['a', 'b'];
> const iterator = iterable[Symbol.iterator]();
> iterator.next()
{ value: 'a', done: false }
> iterator.next()
{ value: 'b', done: false }
> iterator.next()
{ value: undefined, done: true }
The problem is that the previously explained way of iterating is synchronous; it doesn’t work for asynchronous sources of data. For example, in the following code, readLinesFromFile()
cannot deliver its asynchronous data via synchronous iteration:
for (const line of readLinesFromFile(fileName)) {
console.log(line);
}
The proposal specifies a new protocol for iteration that works asynchronously:
- Async iterables are marked via Symbol.asyncIterator.
- Method next() of an async iterator returns Promises for IteratorResults (vs. IteratorResults directly).

You may wonder whether it would be possible to instead use a synchronous iterator that returns one Promise for each iterated element. But that is not enough – whether or not iteration is done is generally determined asynchronously.
Using an asynchronous iterable looks as follows. Function createAsyncIterable()
is explained later. It converts its synchronously iterable parameter into an async iterable.
const asyncIterable = createAsyncIterable(['a', 'b']);
const asyncIterator = asyncIterable[Symbol.asyncIterator]();
asyncIterator.next()
.then(iterResult1 => {
console.log(iterResult1); // { value: 'a', done: false }
return asyncIterator.next();
})
.then(iterResult2 => {
console.log(iterResult2); // { value: 'b', done: false }
return asyncIterator.next();
})
.then(iterResult3 => {
console.log(iterResult3); // { value: undefined, done: true }
});
Within an asynchronous function, you can process the results of the Promises via await
and the code becomes simpler:
async function f() {
const asyncIterable = createAsyncIterable(['a', 'b']);
const asyncIterator = asyncIterable[Symbol.asyncIterator]();
console.log(await asyncIterator.next());
// { value: 'a', done: false }
console.log(await asyncIterator.next());
// { value: 'b', done: false }
console.log(await asyncIterator.next());
// { value: undefined, done: true }
}
In TypeScript notation, the interfaces look as follows.
interface AsyncIterable {
[Symbol.asyncIterator]() : AsyncIterator;
}
interface AsyncIterator {
next() : Promise<IteratorResult>;
}
interface IteratorResult {
value: any;
done: boolean;
}
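For illustration, here is a hand-written object that implements these interfaces directly (a minimal sketch; async generators, covered later, make this much more convenient – the name createNumberAsyncIterable() is made up for this example):

function createNumberAsyncIterable() {
  let n = 0;
  return {
    [Symbol.asyncIterator]() {
      return this;
    },
    next() {
      if (n < 2) {
        // Wrap the IteratorResult in a Promise
        return Promise.resolve({ value: n++, done: false });
      }
      return Promise.resolve({ value: undefined, done: true });
    },
  };
}

Iterating over it delivers 0 and 1 and then signals that it is done.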
for-await-of
The proposal also specifies an asynchronous version of the for-of
loop: for-await-of
:
async function f() {
for await (const x of createAsyncIterable(['a', 'b'])) {
console.log(x);
}
}
// Output:
// a
// b
for-await-of and rejections

Similarly to how await works in async functions, the loop throws an exception if next() returns a rejection:
function createRejectingIterable() {
return {
[Symbol.asyncIterator]() {
return this;
},
next() {
return Promise.reject(new Error('Problem!'));
},
};
}
(async function () { // (A)
try {
for await (const x of createRejectingIterable()) {
console.log(x);
}
} catch (e) {
console.error(e);
// Error: Problem!
}
})(); // (B)
Note that we have just used an Immediately Invoked Async Function Expression (IIAFE, pronounced “yaffee”). It starts in line (A) and ends in line (B). We need to do that because for-await-of doesn’t work at the top level of modules and scripts. It works everywhere await can be used, namely in async functions and async generators (which are explained later).
for-await-of and sync iterables

for-await-of can also be used to iterate over sync iterables:
(async function () {
for await (const x of ['a', 'b']) {
console.log(x);
}
})();
// Output:
// a
// b
for-await-of
converts each iterated value via Promise.resolve()
to a Promise, which it then awaits. That means that it works for both Promises and normal values.
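For example, a sync iterable whose elements are Promises can be consumed directly; each Promise is awaited before the loop body runs (a small illustration):

(async function () {
  const promises = [Promise.resolve('a'), Promise.resolve('b')];
  for await (const x of promises) {
    console.log(x);
  }
})();
// Output:
// a
// b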
Normal (synchronous) generators help with implementing synchronous iterables. Asynchronous generators do the same for asynchronous iterables.
For example, we have previously used the function createAsyncIterable(syncIterable)
which converts a syncIterable
into an asynchronous iterable. This is how you would implement this function via an async generator:
async function* createAsyncIterable(syncIterable) {
for (const elem of syncIterable) {
yield elem;
}
}
Note the asterisk after function: a normal async function is declared via async function, an async generator function via async function*.
How do async generators work?

- A normal generator function returns a generator object genObj. Each invocation genObj.next() returns an object {value,done} that wraps a yielded value.
- An async generator function returns a generator object genObj. Each invocation genObj.next() returns a Promise for an object {value,done} that wraps a yielded value.

Queuing next() invocations

The JavaScript engine internally queues invocations of next() and feeds them to an async generator once it is ready. That is, after calling next(), you can call it again right away; you don’t have to wait for the Promise it returns to be settled. In most cases, though, you do want to wait for the settlement, because you need the value of done
in order to decide whether to call next()
again or not. That’s how the for-await-of
loop works.
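Roughly speaking, the loop drives an async iterator like the following helper does (a simplified sketch that ignores error handling, iterator closing and the conversion of sync iterables; the name forAwaitOfSketch() is made up):

async function forAwaitOfSketch(asyncIterable, body) {
  const asyncIterator = asyncIterable[Symbol.asyncIterator]();
  while (true) {
    // Wait for the Promise returned by next() to be settled
    const { value, done } = await asyncIterator.next();
    if (done) break; // only now do we know whether to continue
    await body(value); // the loop body
  }
}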
Use cases for calling next()
several times without waiting for settlements include:
Use case: Retrieving Promises to be processed via Promise.all()
. If you know how many elements there are in an async iterable, you don’t need to check done
.
const asyncGenObj = createAsyncIterable(['a', 'b']);
const [{value:v1},{value:v2}] = await Promise.all([
asyncGenObj.next(), asyncGenObj.next()
]);
console.log(v1, v2); // a b
Use case: Async generators as sinks for data, where you don’t always need to know when they are done.
const writer = openFile('someFile.txt');
writer.next('hello'); // don’t wait
writer.next('world'); // don’t wait
await writer.return(); // wait for file to close
Acknowledgement: Thanks to @domenic and @zenparsing for these use cases.
await
in async generators You can use await
and for-await-of
inside async generators. For example:
async function* prefixLines(asyncIterable) {
for await (const line of asyncIterable) {
yield '> ' + line;
}
}
One interesting aspect of combining await
and yield
is that await
can’t stop yield
from returning a Promise, but it can stop that Promise from being settled:
async function* asyncGenerator() {
console.log('Start');
const result = await doSomethingAsync(); // (A)
yield 'Result: '+result; // (B)
console.log('Done');
}
Let’s take a closer look at lines (A) and (B):

- The yield in line (B) fulfills a Promise. That Promise is returned by next() immediately.
- Before that fulfillment can happen, the operand of await (the Promise returned by doSomethingAsync() in line (A)) must itself be fulfilled.

That means that these two lines correspond (roughly) to this code:
return new Promise((resolve, reject) => {
doSomethingAsync()
.then(result => {
resolve({
value: 'Result: '+result,
done: false,
});
});
});
If you want to dig deeper – this is a rough approximation of how async generators work:
const BUSY = Symbol('BUSY');
const COMPLETED = Symbol('COMPLETED');
function asyncGenerator() {
const settlers = [];
let step = 0;
return {
[Symbol.asyncIterator]() {
return this;
},
next() {
return new Promise((resolve, reject) => {
settlers.push({resolve, reject});
this._run();
});
},
_run() {
setTimeout(() => {
if (step === BUSY || settlers.length === 0) {
return;
}
const currentSettler = settlers.shift();
try {
switch (step) {
case 0:
step = BUSY;
console.log('Start');
doSomethingAsync()
.then(result => {
currentSettler.resolve({
value: 'Result: '+result,
done: false,
});
// We are not busy, anymore
step = 1;
this._run();
})
.catch(e => currentSettler.reject(e));
break;
case 1:
console.log('Done');
currentSettler.resolve({
value: undefined,
done: true,
});
step = COMPLETED;
this._run();
break;
case COMPLETED:
currentSettler.resolve({
value: undefined,
done: true,
});
this._run();
break;
}
}
catch (e) {
currentSettler.reject(e);
}
}, 0);
}
}
}
This code assumes that next()
is always called without arguments. A complete implementation would have to queue arguments, too.
yield* in async generators

yield* in async generators works analogously to how it works in normal generators – like a recursive invocation:
async function* gen1() {
yield 'a';
yield 'b';
return 2;
}
async function* gen2() {
const result = yield* gen1(); // (A)
// result === 2
}
In line (A), gen2()
calls gen1()
, which means that all elements yielded by gen1()
are yielded by gen2()
:
(async function () {
for await (const x of gen2()) {
console.log(x);
}
})();
// Output:
// a
// b
The operand of yield*
can be any async iterable. Sync iterables are automatically converted to async iterables, just like for for-await-of
.
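For example (gen3() is a made-up name for this illustration):

async function* gen3() {
  yield* ['a', 'b']; // the Array (a sync iterable) is converted automatically
}
(async function () {
  for await (const x of gen3()) {
    console.log(x);
  }
})();
// Output:
// a
// b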
In normal generators, next()
can throw exceptions. In async generators, next()
can reject the Promise it returns:
async function* asyncGenerator() {
// The following exception is converted to a rejection
throw new Error('Problem!');
}
asyncGenerator().next()
.catch(err => console.log(err)); // Error: Problem!
Converting exceptions to rejections is similar to how async functions work.
Async function: the Promise it returns is fulfilled via return and rejected via throw.

(async function () {
return 'hello';
})()
.then(x => console.log(x)); // hello
(async function () {
throw new Error('Problem!');
})()
.catch(x => console.error(x)); // Error: Problem!
Async generator function: each invocation of next() returns a Promise. yield x fulfills the “current” Promise with {value: x, done: false}; throw err rejects the “current” Promise with err.

async function* gen() {
yield 'hello';
}
const genObj = gen();
genObj.next().then(x => console.log(x));
// { value: 'hello', done: false }
The source code for the examples is available via the repository async-iter-demo
on GitHub.
The example repo uses babel-node
to run its code. This is how it configures Babel in its package.json
:
{
"dependencies": {
"babel-preset-es2015-node": "···",
"babel-preset-es2016": "···",
"babel-preset-es2017": "···",
"babel-plugin-transform-async-generator-functions": "···"
···
},
"babel": {
"presets": [
"es2015-node",
"es2016",
"es2017"
],
"plugins": [
"transform-async-generator-functions"
]
},
···
}
Function takeAsync()
collects all elements of asyncIterable
in an Array. I don’t use for-await-of
in this case; instead, I invoke the async iteration protocol manually. I also don’t close asyncIterable
if I’m finished before the iterable is done
.
/**
* @returns a Promise for an Array with the elements
* in `asyncIterable`
*/
async function takeAsync(asyncIterable, count=Infinity) {
const result = [];
const iterator = asyncIterable[Symbol.asyncIterator]();
while (result.length < count) {
const {value,done} = await iterator.next();
if (done) break;
result.push(value);
}
return result;
}
This is the test for takeAsync()
:
test('Collect values yielded by an async generator', async function() {
async function* gen() {
yield 'a';
yield 'b';
yield 'c';
}
assert.deepStrictEqual(await takeAsync(gen()), ['a', 'b', 'c']);
assert.deepStrictEqual(await takeAsync(gen(), 3), ['a', 'b', 'c']);
assert.deepStrictEqual(await takeAsync(gen(), 2), ['a', 'b']);
assert.deepStrictEqual(await takeAsync(gen(), 1), ['a']);
assert.deepStrictEqual(await takeAsync(gen(), 0), []);
});
Note how nicely async functions work together with the mocha test framework: for asynchronous tests, the second parameter of test()
can return a Promise.
The example repo also has an implementation for an asynchronous queue, called AsyncQueue
. Its implementation is relatively complex, which is why I don’t show it here. This is the test for AsyncQueue
:
test('Enqueue before dequeue', async function() {
const queue = new AsyncQueue();
queue.enqueue('a');
queue.enqueue('b');
queue.close();
assert.deepStrictEqual(await takeAsync(queue), ['a', 'b']);
});
test('Dequeue before enqueue', async function() {
const queue = new AsyncQueue();
const promise = Promise.all([queue.next(), queue.next()]);
queue.enqueue('a');
queue.enqueue('b');
return promise.then(arr => {
const values = arr.map(x => x.value);
assert.deepStrictEqual(values, ['a', 'b']);
});
});
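Since the actual implementation isn’t shown here, the following is a minimal sketch of what an asynchronously iterable queue with the same interface (enqueue(), close(), next(), Symbol.asyncIterator) could look like. It is a simplified illustration under those assumptions, not the repo’s AsyncQueue:

class SimpleAsyncQueue {
  constructor() {
    this.values = [];    // enqueued values waiting to be dequeued
    this.settlers = [];  // resolve functions of pending next() calls
    this.closed = false;
  }
  enqueue(value) {
    if (this.settlers.length > 0) {
      // A next() call is already waiting for this value
      this.settlers.shift()({ value, done: false });
    } else {
      this.values.push(value);
    }
  }
  close() {
    this.closed = true;
    // Signal "done" to all pending next() calls
    while (this.settlers.length > 0) {
      this.settlers.shift()({ value: undefined, done: true });
    }
  }
  next() {
    if (this.values.length > 0) {
      return Promise.resolve({ value: this.values.shift(), done: false });
    }
    if (this.closed) {
      return Promise.resolve({ value: undefined, done: true });
    }
    // No value available yet: settle later, in enqueue() or close()
    return new Promise(resolve => this.settlers.push(resolve));
  }
  [Symbol.asyncIterator]() {
    return this;
  }
}

Values and pending next() calls are buffered symmetrically, which is exactly what the two tests above exercise (enqueue before dequeue and dequeue before enqueue).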
Let’s implement code that reads text lines asynchronously. We’ll do it in three steps.
Step 1: Read text data in chunks via the Node.js ReadStream
API (which is based on callbacks) and push it into an AsyncQueue
(which was introduced in the previous section).
/**
* Creates an asynchronous ReadStream for the file whose name
* is `fileName` and feeds it into an AsyncQueue that it returns.
*
* @returns an async iterable
*/
function readFile(fileName) {
const queue = new AsyncQueue();
const readStream = createReadStream(fileName,
{ encoding: 'utf8', bufferSize: 1024 });
readStream.on('data', buffer => {
const str = buffer.toString('utf8');
queue.enqueue(str);
});
readStream.on('end', () => {
// Signal end of output sequence
queue.close();
});
return queue;
}
Step 2: Use for-await-of
to iterate over the chunks of text and yield
lines of text.
/**
* Turns a sequence of text chunks into a sequence of lines
* (where lines are separated by newlines)
*
* @returns an async iterable
*/
async function* splitLines(chunksAsync) {
let previous = '';
for await (const chunk of chunksAsync) {
previous += chunk;
let eolIndex;
while ((eolIndex = previous.indexOf('\n')) >= 0) {
const line = previous.slice(0, eolIndex);
yield line;
previous = previous.slice(eolIndex+1);
}
}
if (previous.length > 0) {
yield previous;
}
}
Step 3: Combine the two previous functions. We first feed chunks of text into a queue
via readFile()
and then convert that queue
into an async iterable over lines of text via splitLines()
.
/**
* @returns an async iterable
*/
function readLines(fileName) {
// `queue` is an async iterable
const queue = readFile(fileName);
return splitLines(queue);
}
Lastly, this is how you’d use readLines()
from within a Node.js script:
(async function () {
const fileName = process.argv[2];
for await (const line of readLines(fileName)) {
console.log('>', line);
}
})();
WHATWG streams are async iterables, meaning that you can use for-await-of
to process them:
const rs = openReadableStream();
for await (const chunk of rs) {
···
}
The spec introduces several new concepts and entities:
- The interfaces AsyncIterable and AsyncIterator
- The intrinsic objects %AsyncGenerator%, %AsyncFromSyncIteratorPrototype%, %AsyncGeneratorFunction%, %AsyncGeneratorPrototype%, %AsyncIteratorPrototype%
- The well-known symbol Symbol.asyncIterator
No new global variables are introduced by this feature.
If you want to understand how async generators work, it’s best to start with Sect. “AsyncGenerator Abstract Operations”. The key to understanding async generators is to understand how queuing works.
Two internal properties of async generator objects play important roles w.r.t. queuing:
- [[AsyncGeneratorState]] contains the state the generator is currently in: "suspendedStart", "suspendedYield", "executing", "completed" (it is undefined before it is fully initialized).
- [[AsyncGeneratorQueue]] holds pending invocations of next/throw/return. Each queue entry contains two fields:
  - [[Completion]]: the parameter of next(), throw() or return() that led to the entry being enqueued. The type of the completion (normal, throw, return) indicates which method call created the entry and determines what happens after dequeuing.
  - [[Capability]]: the PromiseCapability of the pending Promise.

The queue is managed mainly via two operations:
- Enqueuing happens via AsyncGeneratorEnqueue(). This is the operation that is called by next(), return() and throw(). It adds an entry to the AsyncGeneratorQueue. Then AsyncGeneratorResumeNext() is called, but only if the generator’s state isn’t "executing":
  - If the generator invokes next(), return() or throw() from inside itself, then the effects of that call will be delayed.
  - An await leads to a suspension of the generator, but its state remains "executing". Hence, it will not be resumed by AsyncGeneratorEnqueue().
- Dequeuing happens via AsyncGeneratorResumeNext(). AsyncGeneratorResumeNext() is invoked after enqueuing, but also after settling a queued Promise (e.g. via yield), because there may now be new queued pending Promises, allowing execution to continue. If the queue is empty, it returns immediately. Otherwise, the current Promise is the first element of the queue:
  - If the generator was paused via yield, it is resumed and continues to run. The current Promise is later settled via AsyncGeneratorResolve() or AsyncGeneratorReject().
  - If the generator is already completed, AsyncGeneratorResumeNext() calls AsyncGeneratorResolve() and AsyncGeneratorReject() itself, meaning that all queued pending Promises will eventually be settled.

To get an async iterator from an object iterable
, you call GetIterator(iterable, async)
(async
is a symbol). If iterable
doesn’t have a method [Symbol.asyncIterator]()
, GetIterator()
retrieves a sync iterator via method iterable[Symbol.iterator]()
and converts it to an async iterator via CreateAsyncFromSyncIterator()
.
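As a rough illustration, that conversion could look like the following JavaScript sketch (an approximation only; the actual spec operation is defined in terms of spec internals and also forwards throw() and return()):

function createAsyncFromSyncIterator(syncIterator) {
  return {
    [Symbol.asyncIterator]() {
      return this;
    },
    async next(...args) {
      const { value, done } = syncIterator.next(...args);
      // Each value is unwrapped as if via Promise.resolve()
      return { value: await value, done };
    },
  };
}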
The for-await-of loop

for-await-of
works almost exactly like for-of
, but there is an await
whenever the contents of an IteratorResult are accessed. You can see that by looking at Sect. “Runtime Semantics: ForIn/OfBodyEvaluation”. Notably, iterators are closed similarly, via IteratorClose()
, towards the end of this section.
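To illustrate the closing behavior: leaving a for-await-of loop early (e.g. via break) invokes the async iterator’s return() method. The following logging wrapper makes that visible (logClosing() is made up for this sketch; createAsyncIterable() is the helper from earlier):

function logClosing(asyncIterable) {
  const asyncIterator = asyncIterable[Symbol.asyncIterator]();
  return {
    [Symbol.asyncIterator]() {
      return this;
    },
    next() {
      return asyncIterator.next();
    },
    return(value) {
      console.log('Closing');
      return asyncIterator.return
        ? asyncIterator.return(value)
        : Promise.resolve({ value, done: true });
    },
  };
}

(async function () {
  for await (const x of logClosing(createAsyncIterable(['a', 'b']))) {
    console.log(x);
    break; // leaving the loop early triggers return()
  }
})();
// Output:
// a
// Closing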
Let’s look at two alternatives to async iteration for processing async data.
The following code demonstrates the CSP library js-csp
:
var csp = require('js-csp');
function* player(name, table) {
while (true) {
var ball = yield csp.take(table); // dequeue
if (ball === csp.CLOSED) {
console.log(name + ": table's gone");
return;
}
ball.hits += 1;
console.log(name + " " + ball.hits);
yield csp.timeout(100); // wait
yield csp.put(table, ball); // enqueue
}
}
csp.go(function* () {
var table = csp.chan(); // (A)
csp.go(player, ["ping", table]); // (B)
csp.go(player, ["pong", table]); // (C)
yield csp.put(table, {hits: 0}); // enqueue
yield csp.timeout(1000); // wait
table.close();
});
player
defines a “process” that is instantiated twice (in line (B) and in line (C), via csp.go()
). The processes are connected via the “channel” table
, which is created in line (A) and passed to player
via its second parameter. A channel is basically a queue.
How does CSP compare to async iteration?
The following code demonstrates Reactive Programming via the JavaScript library RxJS:
const button = document.querySelector('button');
Rx.Observable.fromEvent(button, 'click') // (A)
.throttle(1000) // at most one event per second
.scan(count => count + 1, 0)
.subscribe(count => console.log(`Clicked ${count} times`));
In line (A), we create a stream of click events via fromEvent()
. These events are then filtered so that there is at most one event per second. Every time there is an event, scan()
counts how many events there have been, so far. In the last line, we log all counts.
How does Reactive Programming compare to async iteration? Its push-based style and its operators (e.g. throttle()) work well for many push-based data sources (DOM events, server-sent events, etc.).

There is an ECMAScript proposal for Reactive Programming, called “Observable” (by Jafar Husain).
Now that I’ve used asynchronous iteration a little, I’ve come to a few conclusions. These conclusions are evolving, so let me know if you disagree with anything.
Promises have become the primitive building block for everything async in JavaScript. And it’s a joy to see how everything is constantly improving: more and more APIs are using Promises, stack traces are getting better, performance is increasing, using them via async functions is great, etc.
We need some kind of support for asynchronous sequences of data. It wouldn’t necessarily have to be a language mechanism; it could just be part of the standard library. At the moment, the situation is similar to one-time async data before Promises: various patterns exist, but there is no clear standard and no interoperability.
Async iteration brings with it considerable additional cognitive load:
Operations are missing: Sync iterables can be converted to Arrays via the spread operator and accessed via Array destructuring. There are no equivalent operations for async iterables.
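To make the missing operations concrete, here is a small comparison (hypothetical usage that reuses takeAsync() and createAsyncIterable() from earlier):

(async function () {
  // Sync iterables: built-in operations
  const syncArr = [...['a', 'b']]; // spread
  const [first] = ['a', 'b'];      // Array destructuring

  // Async iterables: no built-in equivalent; a helper such as takeAsync()
  // (shown earlier) is needed
  const asyncArr = await takeAsync(createAsyncIterable(['a', 'b']));
  console.log(syncArr);  // [ 'a', 'b' ]
  console.log(first);    // a
  console.log(asyncArr); // [ 'a', 'b' ]
})();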
Converting legacy APIs to async iteration isn’t easy. A queue that is asynchronously iterable helps, but the pieces don’t fit together as neatly as I would like. In comparison, going from callbacks to Promises is quite elegant.
Background: