
I know what asynchronous iteration is. But ever since MDN and Node.js 10 introduced it in 2018, I have yet to see a practical use for it, i.e. what kind of problem is better solved by it? A common example I see is rewriting the callback-based consumption of fs.createReadStream as for await (const chunk of stream).
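For reference, that stream example looks something like this sketch (the file path and contents are invented for illustration; any Node.js readable stream can be consumed the same way):

```javascript
// Consuming a Node.js readable stream with for await...of instead of
// wiring up 'data'/'end'/'error' callbacks by hand.
import { createReadStream } from "node:fs";
import { writeFile } from "node:fs/promises";
import os from "node:os";
import path from "node:path";

// Set up a throwaway file so the sketch is self-contained.
const file = path.join(os.tmpdir(), "for-await-demo.txt");
await writeFile(file, "hello, async iteration");

let contents = "";
for await (const chunk of createReadStream(file, { encoding: "utf8" })) {
  contents += chunk; // each chunk arrives as it is read from disk
}
console.log(contents);
```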

Another example I can think of is sending paginated requests in sequence.
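The pagination case might be sketched like this; fetchPage is a stand-in invented for this example, simulating an API where each response carries a cursor to the next page:

```javascript
// Stand-in for a real paginated API call: each "page" holds two items,
// and `next` is null once the data is exhausted.
async function fetchPage(cursor) {
  const all = [1, 2, 3, 4, 5, 6];
  const items = all.slice(cursor, cursor + 2);
  const next = cursor + 2 < all.length ? cursor + 2 : null;
  return { items, next };
}

// The generator issues requests strictly in sequence: each request can
// only start after the previous response supplies the next cursor.
async function* allItems() {
  let cursor = 0;
  while (cursor !== null) {
    const { items, next } = await fetchPage(cursor);
    yield* items;
    cursor = next;
  }
}

// The consumer never sees pages or cursors, only items.
const seen = [];
for await (const item of allItems()) {
  seen.push(item);
}
console.log(seen);
```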

But these two examples are not compelling enough to convince me to use for-await-of. I just cannot think of a scenario where I have an array of promises that I need to resolve one by one.

So can someone give some practical example of using asynchronous iteration to solve a problem better than without it?

Qiulang
  • I know what you mean, just like callback hell, an async thing depends on the result of another async thing. But my problem is I can't think of a scenario where I can get all the promises beforehand and for await of them one by one. – Qiulang Apr 16 '21 at 02:21
  • @danh check the answer I got here https://github.com/tc39/proposal-async-iteration/issues/135 – Qiulang Apr 16 '21 at 04:12
  • 1
    "*a scenario where I have an array of promises*" - it's too late when you already have all the promises (a fixed number of them) in an array; in that case, [you should not use `for await … of`](https://stackoverflow.com/q/60706179/1048572). Asynchronous iteration is only useful where the promises are *created* one after another, e.g. when it depends on the result of the previous item, and it allows you to start processing the first without seeing the end. – Bergi Apr 18 '21 at 15:10
  • "*These examples are not compelling enough to convince me using `for-await-of`.*" - can you show us how you would write code for those without `for await … of`? Maybe it is easier to explain the difference then. But yes, it's mostly just syntactic sugar, you can write any such loop by manually iterating over a structure with `await getNext()` - just like `for … of` is just syntactic sugar, and like every `for` loop can be rewritten as `while`. – Bergi Apr 18 '21 at 15:16
  • @Bergi "it's too late when you already have all the promises" I asked this question first before I saw your answer there. But for your answer there I don't have a clear idea why it is too late then if I really want to resolve them one by one ? – Qiulang Apr 18 '21 at 15:22
  • 1
    @Qiulang Because promises are not resolved from the outside, they don't "start resolving when you await them" - the tasks you execute start when you create the promise, the promise just represents the result. So "*resolve them one by one*" makes no sense. You can only "***create** them one by one*". – Bergi Apr 18 '21 at 15:24
  • Oh right! Thanks!! – Qiulang Apr 18 '21 at 15:26
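Bergi's point that for await … of is mostly syntactic sugar can be shown with a minimal sketch: the same async generator consumed once with the loop and once by manually driving the iterator.

```javascript
// A trivial async generator to iterate over.
async function* numbers() {
  yield 1;
  yield 2;
  yield 3;
}

// The sugared form.
const sugared = [];
for await (const n of numbers()) {
  sugared.push(n);
}

// The desugared equivalent: grab the async iterator and await each next().
const manual = [];
const it = numbers()[Symbol.asyncIterator]();
while (true) {
  const { value, done } = await it.next();
  if (done) break;
  manual.push(value);
}

console.log(sugared, manual);
```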

1 Answer


I am giving an answer by quoting @Bergi's comments and the replies I got at https://github.com/tc39/proposal-async-iteration/issues/135. Without an answer my question will be auto-deleted because it has been down-voted.

@Bergi's comments, from his answer https://stackoverflow.com/a/60707105/301513:

for await … of is useful where the sequence itself is generated asynchronously (and where its length is not known beforehand to the consumer). A perfect example is paginated fetching of a list from an api. Less perfect would be reading files from a large directory - while one could write that code in two steps (getting an array of filenames, then iterating that), sometimes you prefer an api surface that has them packed in one: an asynchronous iterator where you always get filename and contents together.

https://github.com/tc39/proposal-async-iteration/issues/135 gives an example of reading files from a large directory:

import fs from "fs/promises";
import path from "path";

async function* findAllFiles(directory: string) {
    const dirEntries = await fs.readdir(directory, { withFileTypes: true });
    for (const dirEntry of dirEntries) {
        if (dirEntry.isFile()) {
            yield path.join(directory, dirEntry.name); // emit each file as it is found
        } else if (dirEntry.isDirectory()) {
            yield* findAllFiles(path.join(directory, dirEntry.name)); // recurse into subdirectories
        }
    }
}

for await (const file of findAllFiles("/")) {
    if (file.match(/that-file-i-was-looking-for/)) {
        console.log(`Found it!`);
        console.log(file);
        break;
    }
}

To quote the issue thread:

What makes these particularly better than loading all the entries into an array and returning a Promise for the array is that the entries aren't all stored at once: if the list is extremely long (as can be the case for entire file systems), you only receive one item at a time, so memory usage scales with the directory depth rather than the number of files.

Another advantage of this approach is that you don't process what you don't need: when break is used in the loop above, the generator function simply stops. It won't continue past the last file that was yield-ed, so we don't end up searching the entire file system if we find what we need early on.
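That early-termination behaviour can be verified with a minimal sketch (the counter generator is invented for illustration): breaking out of the loop finalizes the generator, so it never runs past the last value it yielded.

```javascript
// Track how far the generator actually got.
let produced = 0;

async function* counter() {
  for (let i = 1; i <= 1000; i++) {
    produced = i;
    yield i;
  }
}

for await (const n of counter()) {
  if (n === 3) break; // the generator's return() is called here; it never reaches 1000
}

console.log(produced);
```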

--- update ---

Six months later I have a better understanding of asynchronous iteration. Now I see that consuming data with for await...of is cleanly separated from the asynchronous generator that produces it; each side can be a straightforward implementation on its own. And I am not always the author of both parts.

BTW, https://www.nodejsdesignpatterns.com/blog/javascript-async-iterators/ gives the most detailed information about async iterators that I have read.
