8

I have a function that makes a REST call to a service and returns a promise; let's call that function execute(). The function takes an ID and sends it as a GET parameter to a REST endpoint, which persists the ID in a MongoDB database along with some additional info.

On the client, I need to run execute() 100k times, once for each ID from 0 to 100k, and show the status of each call (whether it succeeded or failed).

I did the obvious: I created a loop from 0 to 100k and ran execute(i) on each iteration. That eventually froze my Chrome tab as it ran out of memory (insufficient resources), and it also caused network congestion from all the REST calls hitting the back end at once.

So I wanted to "chop" those 100k calls into manageable batches of, say, 50 promises each. When those 50 are all done (whether failed or succeeded), I want to use Promise.all([]).then(...) to execute the next 50, and so on until all 100k are done. This way I control the network congestion and memory usage at the same time. However, I can't figure out how to structure this. Here is my code:

let promises = []
for (let i = 0; i < 100000; i++) {
    promises.push(execute(i))
    if (i % 50 === 0) {
        // the loop doesn't wait here; it keeps pushing new calls immediately
        Promise.all(promises)
            .then(a => updateStatus(a, true))
            .catch(a => updateStatus(a, false))
    }
}

The asynchronous nature of JavaScript means the loop keeps running and firing off calls without waiting for each batch to finish. I really don't want to put in a timer to hold the loop every 50 iterations, because that would block the UI and effectively turn my app synchronous. Any suggestions on how to tackle this?

Thank you very much.

New to JavaScript.

Hussein Nasser
  • You should simply put your promises into an array of arrays. That way, you execute the first array of 50 promises, then the second array, ... – Deblaton Jean-Philippe Jan 19 '18 at 16:41
  • Chunk your promises into a new array of arrays, then simply process each sub-array one at a time, which can easily be done with a reduce. – Kevin B Jan 19 '18 at 16:55
  • The simplest solution would be to put an `await` in front of that `Promise.all` call if your environment supports it… you must not continue synchronously with the loop after having created the 50 executions. – Bergi Jan 19 '18 at 17:46
  • Bluebird's [`Promise.map()`](http://bluebirdjs.com/docs/api-reference.html) offers a concurrency feature that lets you specify how many requests are "in flight" at the same time. This is more of a continuous design than chunking by 50 (see the sketch after these comments). – jfriend00 Jan 19 '18 at 18:18
  • Duplicate of https://stackoverflow.com/questions/40639432/what-is-the-best-way-to-limit-concurrency-when-using-es6s-promise-all which has numerous suggestions – Ahmed Fasih Sep 22 '22 at 01:48
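
A minimal sketch of the concurrency-limited approach from jfriend00's comment, assuming Bluebird is installed and that execute(id) returns a promise and updateStatus is the status handler from the question:

const Promise = require('bluebird');

const ids = Array.from({ length: 100000 }, (_, i) => i);

// Keep at most 50 requests in flight at any one time; each result (or error)
// is reported through the question's own updateStatus handler.
Promise.map(ids, id =>
  execute(id)
    .then(result => updateStatus(result, true))
    .catch(err => updateStatus(err, false)),
  { concurrency: 50 }
).then(() => console.log('all done'));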

3 Answers

4

You can use async/await to perform the asynchronous tasks in sequential order: schedule a call to the same function if the original array still contains elements, else return the array of results.

let arr = Array.from({
  length: 2000
}, (_, i) => i);

// Queue of pending "requests" (here just the IDs) and the collected results.
let requests = arr.slice(0);

let results = [];

// Await one chunk of 50, then recurse with the next chunk until the queue is empty.
let fn = async (chunks, results) => {
  // Stand-in for the real calls: each ID resolves after 500 ms.
  let curr = await Promise.all(chunks.map(prop =>
               new Promise(resolve => setTimeout(resolve, 500, prop))));
  results.push(curr);
  console.log(curr);

  return requests.length
         ? fn(requests.splice(0, 50), results)
         : results
}

fn(requests.splice(0, 50), results)
.then(data => console.log(data))
.catch(err => console.error(err))
guest271314
  • See also [multiple, sequential fetch() Promise](https://stackoverflow.com/questions/38034574/multiple-sequential-fetch-promise/) – guest271314 Jan 19 '18 at 16:59
  • [jQuery - Can threads/asynchronous be done?](https://stackoverflow.com/questions/26068821/jquery-can-threads-asynchronous-be-done) – guest271314 Jan 19 '18 at 17:15
3

With promises alone, this can only be done with recursion.

If you can use a newer version of Node.js, use async/await; it will work as you expect, and you can simply await Promise.all(promises) for each chunk.
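
A minimal sketch of that chunked loop, assuming execute(id) returns a promise and updateStatus(result, ok) is the question's own status handler:

// Process the 100k IDs in chunks of 50, waiting for each chunk to settle
// before the next 50 requests are created.
async function runInChunks(total = 100000, chunkSize = 50) {
  for (let i = 0; i < total; i += chunkSize) {
    const chunk = [];
    for (let id = i; id < Math.min(i + chunkSize, total); id++) {
      chunk.push(
        execute(id)
          .then(result => updateStatus(result, true))
          .catch(err => updateStatus(err, false))
      );
    }
    // Because every promise in the chunk handles its own error,
    // this await never rejects and the loop simply moves on.
    await Promise.all(chunk);
  }
}

runInChunks().then(() => console.log('all done'));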

If you can't, there is a nice library (called Async) that can execute up to 50 asynchronous calls at once with this method: https://caolan.github.io/async/v3/docs.html#parallelLimit

It is even better than chunks, because with chunks a single slow callback blocks everything else in its chunk, while parallelLimit just keeps 50 callbacks executing at all times. (However, you can still pre-create chunks of 50 if you insist on them and use the .series method.)
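
A rough sketch of the parallelLimit route, again assuming execute(id) returns a promise and updateStatus is the question's status handler:

const async = require('async');

// One task per ID; each task reports its outcome through the node-style callback.
const tasks = Array.from({ length: 100000 }, (_, id) => callback => {
  execute(id)
    .then(result => { updateStatus(result, true); callback(null, result); })
    .catch(err => { updateStatus(err, false); callback(null, err); }); // don't abort the run on failures
});

// Never more than 50 tasks running at the same moment.
async.parallelLimit(tasks, 50, (err, results) => {
  console.log('all done', results.length);
});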

Samuel_NET
libik
  • Thanks. async await and generator functions did the trick for me. I wanted to "pause" the app for a while to allow stuff to get crunched at the backend but I wanted the app to remain responsive. – Hussein Nasser Jan 22 '18 at 18:10
0

You can wrap each promise in a function and push it into an array. Then split the array into chunks and process each chunk with reduce, as sketched below.
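
A rough sketch of that chunk-and-reduce pattern, assuming execute(id) returns a promise and updateStatus is the question's status handler; the chunk size of 50 is just the number from the question:

// Wrap each call in a function so nothing starts until its chunk is processed.
const makeTask = id => () => execute(id)
  .then(result => updateStatus(result, true))
  .catch(err => updateStatus(err, false));

const tasks = Array.from({ length: 100000 }, (_, id) => makeTask(id));

// Split the task functions into chunks of 50.
const chunks = [];
for (let i = 0; i < tasks.length; i += 50) {
  chunks.push(tasks.slice(i, i + 50));
}

// reduce chains the chunks: each Promise.all starts only after the previous one settles.
chunks.reduce(
  (prev, chunk) => prev.then(() => Promise.all(chunk.map(task => task()))),
  Promise.resolve()
).then(() => console.log('all done'));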

There is also an npm package for this:
https://www.npmjs.com/package/concurrency-promise

Roman Rhrn Nesterov