Async generators use an internal queue to handle synchronous `next()`, `throw()`, and `return()` method calls.
I was trying to construct a situation where this queue is mandatory for the iteration itself to succeed. In other words, I'm looking for cases where a manual implementation of the async iteration interfaces, without a custom reimplementation of the queue, is not enough.
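As a minimal, network-free sketch of what that queue buys you (the `counter` generator and the random delays are illustrative assumptions, not from any library): even when `next()` is called several times synchronously without awaiting, the generator's internal queue delivers the results strictly in order, one state transition per call.

```javascript
async function* counter() {
  let i = 1;
  while (i <= 3) {
    // simulate async work with unpredictable latency
    await new Promise(r => setTimeout(r, Math.random() * 20));
    yield i++;
  }
}

const it = counter();
// four synchronous, unawaited calls; the fourth lands past the end
const ps = [it.next(), it.next(), it.next(), it.next()];
Promise.all(ps).then(results => {
  console.log(results.map(r => r.value)); // logs [ 1, 2, 3, undefined ]
});
```

Despite the unpredictable latencies, no two calls ever run their bodies interleaved: each one is enqueued behind the previous.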
The following example is not a great one, because overall time consistency is not maintained, but the iteration result is correct at each step:
function aItsFactory() {
  let i = 1;
  return {
    async next() {
      if (i > 5) return { value: undefined, done: true };
      const res = await fetch(`https://jsonplaceholder.typicode.com/posts/${i++}`).then(x => x.json());
      return { value: res, done: false };
    },
    [Symbol.asyncIterator]() {
      return this;
    }
  };
}
const ait = aItsFactory();
// general time consistency is lost: e.g. the fourth call
// starts concurrently with the previous three and could finish before them.
// But the 'i' state is correctly shared, so the fifth call
// correctly requests element number five from the source
// and the last call correctly receives { done: true }
;(async () => {
  ait.next();
  ait.next();
  ait.next();
  ait.next();
  console.log(await ait.next()); // { done: false, value: { userId: 1, id: 5, title: ... } }
  console.log(await ait.next()); // { done: true, value: undefined }
})();
It could be argued that without a proper queue, the concept of iteration itself is lost, because of the concurrently pending next calls.
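A minimal sketch of that breakage (the `brokenFactory` name and the fixed delay are illustrative assumptions): if the state is read before an `await` and written back after it, synchronous unawaited `next()` calls all observe the same state, so the iteration yields duplicates.

```javascript
function brokenFactory() {
  let i = 1;
  return {
    async next() {
      if (i > 3) return { done: true, value: undefined };
      const current = i;                         // state read...
      await new Promise(r => setTimeout(r, 10)); // ...async gap...
      i = current + 1;                           // ...state write
      return { done: false, value: current };
    },
    [Symbol.asyncIterator]() {
      return this;
    }
  };
}

const broken = brokenFactory();
// three synchronous calls all read i === 1 before any of them increments it
Promise.all([broken.next(), broken.next(), broken.next()])
  .then(rs => console.log(rs.map(r => r.value))); // logs [ 1, 1, 1 ]
```

Wrapping the same logic in an async generator would yield 1, 2, 3 instead, because each next call is queued behind the previous one.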
Anyway, I'd like to find some examples, even trivial ones, that make clear that async generators are a better approach for creating well-formed async iterables than a manual implementation of the async iteration interfaces.
------ Edit ------
Let's talk about an improved situation:
function aItsFactory() {
  let i = 1;
  let done = false;
  return {
    async next() {
      if (done) return { done: true, value: undefined };
      const res = await fetch(`https://jsonplaceholder.typicode.com/posts/${i++}`).then(x => x.json());
      if (Object.keys(res).length === 0) { // the jsonplaceholder source is out of bounds
        done = true;
        return { done: true, value: undefined };
      }
      return { done: false, value: res };
    },
    [Symbol.asyncIterator]() {
      return this;
    }
  };
}
const ait = aItsFactory();
// now many synchronous calls to 'ait.next'
Here the `done` resolution is fully asynchronous. From an async iteration perspective, the code is wrong because each `next` call should be forced to await the outcome of the previous one to know whether it was the last valid iteration. In that case, the current `next` should do nothing, immediately returning `Promise.resolve({ done: true, value: undefined })`.
This is only possible thanks to a queue of synchronous `next` calls.
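One way to approximate that queue by hand (a sketch, not the spec's actual algorithm; `queuedFactory` and the bound of 3 are illustrative assumptions) is to serialize every `next()` call on a growing promise chain:

```javascript
function queuedFactory() {
  let i = 1;
  let done = false;
  let chain = Promise.resolve(); // the "queue": a growing promise chain
  return {
    next() {
      const step = chain.then(async () => {
        if (i > 3) done = true;    // out of bounds: finish
        if (done) return { done: true, value: undefined };
        await new Promise(r => setTimeout(r, 10)); // simulated async work
        // safe to touch state after the await: calls are serialized
        return { done: false, value: i++ };
      });
      chain = step.catch(() => {}); // keep the chain alive on rejection
      return step;
    },
    [Symbol.asyncIterator]() {
      return this;
    }
  };
}

const q = queuedFactory();
// four synchronous, unawaited calls, yet each runs only after the previous settled
Promise.all([q.next(), q.next(), q.next(), q.next()])
  .then(rs => console.log(rs.map(r => r.value))); // logs [ 1, 2, 3, undefined ]
```

Each call chains onto the previous one, so state is only ever touched after the prior step has settled, which is exactly the guarantee async generators give for free.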
But in practice, the main risk of going out of bounds by calling ait.next() repeatedly is a few useless AJAX requests.
Don't misunderstand me, I'm not saying that we can turn a blind eye.
The point is that each step of the async iteration itself will never be broken.
I'd like to see a situation, not too unrealistic, where the iteration itself could be compromised at each step if all the next calls are not enqueued.