This answer mostly quotes @Bergi's comments and the replies I received at https://github.com/tc39/proposal-async-iteration/issues/135. (Without an answer, my question would be auto-deleted because it was down-voted.)
From @Bergi's comments on his answer https://stackoverflow.com/a/60707105/301513:
> `for await … of` is useful where the sequence itself is generated asynchronously (and where its length is not known beforehand to the consumer). A perfect example is paginated fetching of a list from an api. Less perfect would be reading files from a large directory - while one could write that code in two steps (getting an array of filenames, then iterating that), sometimes you prefer an api surface that has them packed in one: an asynchronous iterator where you always get filename and contents together.
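To make the paginated case concrete, here is a minimal sketch. The endpoint URL and the `{ items, nextCursor }` response shape are assumptions for illustration, not any particular API:

```typescript
// Minimal sketch of paginated fetching as an async generator.
// The URL and response shape are made up for illustration.
async function* fetchAllItems(baseUrl: string): AsyncGenerator<unknown> {
  let cursor: string | undefined;
  do {
    const url = cursor ? `${baseUrl}?cursor=${cursor}` : baseUrl;
    const response = await fetch(url);
    const page = (await response.json()) as { items: unknown[]; nextCursor?: string };
    // Hand items to the consumer one at a time; the next page is only
    // requested if the consumer keeps iterating.
    yield* page.items;
    cursor = page.nextCursor;
  } while (cursor !== undefined);
}

// The consumer never needs to know how many pages there are.
// (Top-level await requires an ES module.)
for await (const item of fetchAllItems("https://example.com/api/list")) {
  console.log(item);
}
```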
https://github.com/tc39/proposal-async-iteration/issues/135 gives an example of reading files from a large directory:
```typescript
import fs from "fs/promises";
import path from "path";

async function* findAllFiles(directory: string) {
  const dirEntries = await fs.readdir(directory, { withFileTypes: true });
  for (const dirEntry of dirEntries) {
    if (dirEntry.isFile()) {
      yield path.join(directory, dirEntry.name);
    } else if (dirEntry.isDirectory()) {
      // Recurse into subdirectories, delegating to the nested generator.
      yield* findAllFiles(path.join(directory, dirEntry.name));
    }
  }
}

for await (const file of findAllFiles("/")) {
  if (file.match(/that-file-i-was-looking-for/)) {
    console.log(`Found it!`);
    console.log(file);
    break; // stops the generator; no further directories are read
  }
}
```
To quote the same issue:
> What makes these particularly better than doing it via loading all the entries into an array and returning a Promise for the array is that they aren't actually all stored at once, so if the list is extremely long (such as can be the case for entire file systems) you only actually receive one item at a time; the memory usage is merely the directory depth rather than the number of files.
> Another advantage of this approach is that you don't process what you don't need: when `break` is used in the above loop, the generator function actually just stops. It won't continue past the last file which was `yield`-ed, and as such we don't wind up searching the entire file system if we find what we need early on.
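This early stop is observable: `break` in a `for await...of` loop calls the generator's `return()` method, which runs any pending `finally` block inside the generator. A minimal sketch (the `ticks` generator is made up for illustration):

```typescript
async function* ticks(): AsyncGenerator<number> {
  let i = 0;
  try {
    while (true) {
      yield i++;
    }
  } finally {
    // Reached when the consumer breaks: for await...of calls the
    // generator's return(), which finishes the generator here.
    console.log("generator stopped");
  }
}

for await (const t of ticks()) {
  if (t >= 2) break; // logs "generator stopped" before the loop exits
}
```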
--- update ---
Six months later, I have a better understanding of asynchronous iteration. I now see that consuming data with `for await...of` is cleanly separated from the asynchronous generator that produces it; each side can be a straightforward implementation on its own, and I am not always the author of both parts.
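A minimal sketch of that separation, using a made-up number stream: the two sides share only the async-iterable protocol, so either one can be written (or replaced) independently.

```typescript
// Producer: an async generator, written without knowing who will consume it.
async function* numbers(limit: number): AsyncGenerator<number> {
  for (let i = 1; i <= limit; i++) {
    // Simulate an asynchronous source (network, disk, etc.).
    await new Promise((resolve) => setTimeout(resolve, 100));
    yield i;
  }
}

// Consumer: written without knowing how the values are produced.
for await (const n of numbers(3)) {
  console.log(n); // 1, 2, 3, one at a time
}
```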
By the way, https://www.nodejsdesignpatterns.com/blog/javascript-async-iterators/ is the most detailed treatment of async iterators I have read.