
Async generators use an internal queue to handle synchronous `next`, `throw`, and `return` method calls.
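This queueing can be observed directly: even if several `next()` calls are issued synchronously, an async generator delivers results in yield order, because each call waits for the previous one to settle. A minimal sketch (the timings are arbitrary):

```javascript
const delay = ms => new Promise(res => setTimeout(res, ms));

async function* gen() {
  await delay(30); yield 1; // the slowest step comes first
  await delay(20); yield 2;
  await delay(10); yield 3;
}

const g = gen();

// Three synchronous next() calls, no awaiting in between:
// without the internal queue, 3 could resolve before 1.
Promise.all([g.next(), g.next(), g.next()])
  .then(rs => console.log(rs.map(r => r.value))); // [ 1, 2, 3 ]
```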

I was trying to construct a situation where this queue is mandatory for the success of the iteration itself. In other words, I'm looking for cases where a manual implementation of the async iteration interfaces, without a custom reimplementation of the queue, is not enough.

The following example is not a great one: overall time consistency is not maintained, but the iteration result is still correct at each step:

function aItsFactory() {
    let i = 1;
    return {
        async next() {
            if(i > 5) return Promise.resolve({ value: void 0, done: true });
            const res = await fetch(`https://jsonplaceholder.typicode.com/posts/${i++}`).then(x => x.json());
            return Promise.resolve({ value: res, done: false });
        },
        [Symbol.asyncIterator]() { 
            return this;
        }
    }
}

const ait = aItsFactory();


// general time consistency is lost, because e.g. the fourth call
// is started with the previous three and it could end before the others.

// But the 'i' state is correctly shared, so the fifth call
// correctly requests element number five from the source
// and the last call will correctly receive { done: true }

;(async () => {
      ait.next();
      ait.next();
      ait.next();
      ait.next();
      console.log(await ait.next()); // { done: false, value: { userId: 1, id: 5, title: ... } }

      console.log(await ait.next()); // { done: true, value: undefined }
})();

It could be argued that without a proper queue the concept of iteration itself is lost, because of the concurrently active `next` calls.

Anyway, I'd like to find some examples, even trivial ones, that make clear that async generators are a better approach for creating well-formed async iterables than a manual implementation of the async iteration interfaces.

------ Edit ------

Let's talk about an improved situation:

function aItsFactory() {
    let i = 1;
    let done = false;

    return {
        async next() {

            if (done) return Promise.resolve({
                done: true,
                value: undefined
            });

            const res = await fetch(`https://jsonplaceholder.typicode.com/posts/${i++}`).then(x => x.json());

            if (Object.keys(res).length === 0) { // the jsonplaceholder source is out of bounds
                done = true;
                return Promise.resolve({
                    done: true,
                    value: undefined
                });
            } else {
                return Promise.resolve({
                    done: false,
                    value: res
                });
            }

        },
        [Symbol.asyncIterator]() {
            return this;
        }
    }
}

const ait = aItsFactory();

// now lots of synchronous calls to 'ait.next'

Here the `done` resolution is fully asynchronous. From an async iteration perspective, the code is wrong because each `next` call should be forced to await the outcome of the previous one to know whether that was the last valid iteration. In such a case, the current `next` should do nothing, immediately returning `Promise.resolve({ done: true, value: undefined })`. This is only possible thanks to a queue of sync `next` calls.
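For comparison, a manual implementation can reproduce that behaviour only by rebuilding the queue itself, e.g. by chaining every `next()` call onto a promise tracking the previous call. A sketch, where `step` is a hypothetical async producer standing in for the fetch:

```javascript
function queuedFactory(step) {
  // 'last' is the tail of the queue: each next() chains after the
  // previous call, so results settle strictly in order and no call
  // past the end ever reaches 'step'.
  let last = Promise.resolve({ done: false });
  return {
    next() {
      last = last.then(prev =>
        prev.done ? { done: true, value: undefined } : step()
      );
      return last;
    },
    [Symbol.asyncIterator]() { return this; }
  };
}

// Hypothetical producer: three values, then done.
let i = 0;
const it = queuedFactory(async () =>
  i < 3 ? { done: false, value: ++i } : { done: true, value: undefined }
);
```

With this in place, a burst of synchronous `it.next()` calls behaves like an async generator: once one call reports `done: true`, the later ones resolve to `{ done: true, value: undefined }` without touching the source again.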

But in practice the major risk of going out of bounds by calling `ait.next()` repeatedly is a few useless AJAX requests. Don't misunderstand me, I'm not saying we can turn a blind eye. The point is that each step of the async iteration itself will never be broken.

I'd like to see a situation, not too unrealistic, where the iteration itself could be compromised at each step if all the next calls are not enqueued.

Andrea Simone Costa
  • And ... what's your question? – Jonas Wilms Aug 20 '19 at 17:46
  • My question is clear. I'm trying to find some examples where we are forced to use async gens because of their queue. – Andrea Simone Costa Aug 20 '19 at 17:48
  • Didn't you already show such an example? – Bergi Aug 20 '19 at 19:33
  • No, because going out of bounds synchronously does not have relevant side effects. – Andrea Simone Costa Aug 20 '19 at 20:00
  • You already demonstrated how going out of bounds can have an unwanted side effect. Surely you can think of another example where this side effect is relevant. – Bergi Aug 20 '19 at 20:49
  • @Bergi You are right, but generally the examples that come to my mind are similar to the last one, where the side effects are "not so relevant". Others were too unrealistic or too specific. I have to admit I was sure I could find more immediate cases of async gens' superiority over a manual implementation of the async iteration interfaces. – Andrea Simone Costa Aug 20 '19 at 20:57
  • Let's simply make the http requests relevant by having them trigger the server's rate limiting. They're all sent at once, in contrast to being sent sequentially if the `next()` calls had been queued. There are many other examples of asynchronous functions only working non-concurrently. – Bergi Aug 20 '19 at 21:00
  • The server rate limit sounds too specific, but you are right about asynchronous functions that cannot work concurrently. Thanks for the hint – Andrea Simone Costa Aug 20 '19 at 21:05
  • "too specific" ... really? Async iterators do have really limited, specific use cases. – Jonas Wilms Aug 20 '19 at 21:08
  • After I finally got your question I think it is actually more difficult than I thought, I retracted my close vote ... :) – Jonas Wilms Aug 20 '19 at 21:40
  • np @jonas :D As I said to Bergi, I thought the difference would generally be much more relevant. That's why I used the words "too specific". Yes, async generators are not as common as math operations, so it's clear that to show their usefulness we cannot talk about simple stuff. I need to search more :) I'm sure there is a straightforward pattern that implies the need for the queue... surely the answer is near pagination/datasets to handle (split/search/merge) etc. – Andrea Simone Costa Aug 20 '19 at 21:53
  • @t-j-crowder perhaps you may help us here? – briosheje Aug 29 '19 at 15:29
  • @briosheje you can't just ping people that way. – Jonas Wilms Aug 29 '19 at 19:50
  • @JonasWilms Oh. I thought that could work. Nevermind, I just thought he could give an interesting answer, but hey :(. – briosheje Aug 29 '19 at 20:02

1 Answer


Consider the following scenario: you have a stream of datasets coming in, e.g. from some API. You want to run some heavy calculation on each dataset, so you send each one to a worker. But sometimes the API might deliver multiple datasets at once, and you don't want lots of workers running at the same time; instead you want a limited number of workers. Within those datasets you are searching for a specific result. With async iterators you could write it as:

 const incoming = createSomeAsyncIterator();

 async function processData() {
   let done, value;
   while (!done) {
     ({ done, value } = await incoming.next());
     if (!done) {
       const result = await searchInWorker(value);
       if (result) {
         incoming.return();
         return result;
       }
     }
   }
 }

 // Consume tasks in two workers.
 Promise.race([
   processData(), processData()
 ]).then(gold => { /* ... */ });

The code above would fail if `.next()` didn't return the datasets in order: one of the workers might keep going although the search is already done, or two workers might end up working on the same dataset.
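To make that concrete, here is a runnable version of the same pattern with stubbed-in pieces (`createSomeAsyncIterator`, `searchInWorker` and the target value are all invented for the demo). The generator's queue guarantees that every dataset is handed to exactly one worker:

```javascript
const delay = ms => new Promise(res => setTimeout(res, ms));

// Stub stream standing in for the API.
async function* createSomeAsyncIterator() {
  for (let i = 1; i <= 6; i++) {
    await delay(5);
    yield i;
  }
}

// Stub worker: dataset 4 is the result we are searching for.
const searchInWorker = async n => { await delay(10); return n === 4; };

const incoming = createSomeAsyncIterator();
const seen = []; // every dataset a worker received

async function processData() {
  while (true) {
    const { done, value } = await incoming.next();
    if (done) return;
    seen.push(value); // each value is delivered to exactly one worker
    if (await searchInWorker(value)) {
      incoming.return(); // close the stream for the other worker too
      return value;
    }
  }
}

const workers = [processData(), processData()];
Promise.race(workers).then(gold => console.log("found:", gold));
```

Because the two `incoming.next()` calls are enqueued, `seen` never contains duplicates; with a manual, queue-less iterator both workers could receive the same dataset.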


Or the rate limiting example (stolen from Bergi :)):

 async function* rateLimit(limit, time) {
   let count = 0;
   while (true) {
     if (count++ >= limit) {
       await delay(time);
       count = 0;
     }
     yield; // run api call
   }
 }

 const userAPIRate = rateLimit(10, 1000);

 async function getUser(id) {
   await userAPIRate.next();
   return doCall("/user/", id);
 }

Or imagine you want to show a stream of pictures in some form of gallery (in React):

 const images = streamOfImages();

 const Image = () => {
   const [image, setImage] = useState(null);

   useEffect(() => {
     (async () => {
       if (image) await delay(10000); // show each image for at least 10 secs
       const { value } = await images.next();
       setImage(value);
     })();
   }, [image]);

   return <img src={image || "loading.png"} />;
 };

 const Gallery = () => <div>
   <Image /> <Image /> <Image />
 </div>;

And another one, scheduling data onto a worker so that only one task runs at a time:

 const worker = (async function* () {
   let task;
   while (true) task = yield task && await doInWorker(task);
 })();

 worker.next(); // prime the generator

 worker.next("task 1").then(taskOne => ...);
 worker.next("task 2").then(taskTwo => ...);
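`doInWorker` is left undefined above; with a hypothetical stub in its place the key property shows up: although both `next()` calls are issued back to back, the generator's queue runs the tasks strictly one at a time:

```javascript
const delay = ms => new Promise(res => setTimeout(res, ms));

// Stub: pretend each task takes some time in a worker.
let running = 0;
async function doInWorker(task) {
  if (running > 0) throw new Error("two tasks at once!");
  running++;
  await delay(20);
  running--;
  return task + " done";
}

const worker = (async function* () {
  let task;
  while (true) task = yield task && await doInWorker(task);
})();

worker.next(); // prime the generator up to the first yield

// Both calls are synchronous; the queue serializes the work.
worker.next("task 1").then(r => console.log(r.value)); // task 1 done
worker.next("task 2").then(r => console.log(r.value)); // task 2 done
```

Without the queue, the second `next("task 2")` would resume the generator while `doInWorker("task 1")` was still awaiting, and the two tasks would overlap.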
Jonas Wilms
  • @Jonas sure, this is a good example, maybe not so immediate but good, of an async iteration where the order of 'next' calls matters. But you always have to put await in front of incoming.next() to make the whole thing do its job properly. In other words, there are no use cases where one could be tempted not to call await. So the usefulness of the queue fades into the background IMO. – Andrea Simone Costa Aug 20 '19 at 21:18
  • Oh, I do have another one, wait. – Jonas Wilms Aug 20 '19 at 21:19
  • @andrea there is *no sense in building up an iterator at all if you don't consume its data, aka `await` the results?!* – Jonas Wilms Aug 20 '19 at 21:23
  • @patrick yeah, I think the main point the OP is asking about is that async iterators are usually consumed with `for await(entry of iter)` and that correctly `await`s every call to `.next()` before calling the next one. Based on that, the spec could've defined that an iterator throws if `.next()` gets called before the previous one settled. It's quite difficult to construct cases where calling `.next()` out of order would be useful. – Jonas Wilms Aug 20 '19 at 21:31
  • @jonas yes there is no sense but it could happen, and async gen helps us to maintain a consistent outcome. I was trying to define the border. About your last example...could you explain better the logic of your destructuring? Promise.all will return a promise wrapping an array with all the IterationResults...each of them could be the one with `done:true` but you are destructuring only the first – Andrea Simone Costa Aug 20 '19 at 21:37
  • @andrea yeah, I admit that that example was bad. – Jonas Wilms Aug 20 '19 at 21:38
  • @Jonas Ahah, thanks for trying again and again to satisfy my whims XD The last example is very close because, without the queue, each new call to `next` would overwrite the previous "instance" of the worker... certainly with unpleasant consequences – Andrea Simone Costa Aug 21 '19 at 06:10