There are 100 promises in an array and we need to process 5 at a time in JS. how to achieve this? (Asked in Microsoft interview)
- I think this was a trick question. You already have a problem. The 100 promises in an array represent 100 asynchronous operations that HAVE ALREADY BEEN STARTED. So, if the goal is to run no more than 5 asynchronous operations at a time, you have to back up several steps and not start more than 5 asynchronous operations at a time. I've written code to do this 4 or 5 times, all in answers here. – jfriend00 Feb 17 '20 at 22:36
- See links to 5 separate implementations that process an array with asynchronous operations (with no more than N operations in flight at the same time) in the 2nd half of this answer: [Properly batch asynchronous operations with promises](https://stackoverflow.com/questions/59976352/properly-batch-nested-promises-in-node/59976509#59976509). – jfriend00 Feb 17 '20 at 22:41
- Define "process 5 at a time". That would have been my first question to the interviewer. – jfriend00 Feb 18 '20 at 00:41
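A minimal sketch of the pattern these comments describe: keep an array of *task functions* (operations that have not been started yet) rather than promises, and let a fixed number of workers pull from it, so no more than 5 operations are ever in flight at once. The `makeTask` helper here is a hypothetical stand-in for real asynchronous work.

```javascript
// makeTask is a hypothetical stand-in for real async work: calling the
// returned function *starts* the operation and yields a promise.
const makeTask = (i) => () =>
  new Promise((resolve) => setTimeout(() => resolve(i), 10));

// 100 task functions -- nothing has been started yet.
const tasks = Array.from({ length: 100 }, (_, i) => makeTask(i));

// Run the tasks with at most `limit` in flight at once: `limit` workers
// each pull the next unstarted task until the queue is empty.
async function runWithLimit(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++; // claim an index (safe: JS runs this single-threaded)
      results[i] = await tasks[i]();
    }
  }
  await Promise.all(Array.from({ length: limit }, worker));
  return results;
}

runWithLimit(tasks, 5).then((r) => console.log(r.length)); // → 100
```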
2 Answers
Use a pool. There are a number of implementations in JS, such as this one that has a nice-looking API:

```js
const PromisePool = require("async-promise-pool");

// concurrency is the only option for PromisePool and enables you to
// choose how many promises will run at once
const pool = new PromisePool({ concurrency: 3 });

// Elsewhere, add functions to the pool that produce promises. We use
// functions here to prevent the promises from immediately executing.
pool.add(() => thingThatReturnsAPromise());

// You can await pool.all to ensure that all promises in the pool are
// resolved before continuing.
await pool.all();
```

coreyward
- This is clearly the correct answer for all practical purposes. In a job interview, which is where the question came from, the interviewer will probably expect some notes on the implementation under the hood though, I'd assume. – TimoStaudinger Feb 17 '20 at 19:00
- @Timo Possibly; it really depends on the position and interviewer, but in either case the implementation is readily available in the source of the linked library (and it's only 74 sloc) should anybody want to learn more. – coreyward Feb 17 '20 at 19:01
- @Timo I think the interviewer actually wants you to create the logic yourself. Not sure, but I would never use a library for this in an interview. Of course it's clean and elegant :) – Rashomon Feb 17 '20 at 19:26
I would use a function to execute promises in sequence instead of in parallel. Once done, create an array of groups of 5 to resolve in parallel using Promise.all:
```js
const PROMISES_AMOUNT = 100
const GROUP_AMOUNT = 5

// Function to divide the array into chunks of similar size
function chunkArray(myArray, chunk_size) {
  const tempArray = []
  for (let index = 0; index < myArray.length; index += chunk_size) {
    const myChunk = myArray.slice(index, index + chunk_size)
    // Do something if you want with the group
    tempArray.push(myChunk)
  }
  return tempArray
}

// The promise we will use
function interval(index) {
  return new Promise(function (resolve, reject) {
    const time = index * 100
    setTimeout(function () {
      console.log(`Waited ${time}!`)
      resolve(index)
    }, time)
  })
}

// Our array of 100 promises
const promises = new Array(PROMISES_AMOUNT).fill(null).map((_, index) => interval(index))

// The array of 100 promises divided into groups of 5 elements
const groupedPromises = chunkArray(promises, GROUP_AMOUNT).map((promisesGroup) => () => Promise.all(promisesGroup))

// A function to execute promise-returning tasks in sequence
const promisesInSequence = (arrayOfTasks) => {
  const results = []
  return new Promise((resolve, reject) => {
    const resolveNext = (tasks) => {
      // If all tasks are already resolved, return the final array of results
      if (tasks.length === 0) return resolve(results)
      // Extract the first task and run it
      const first = tasks.shift()
      first()
        .then((res) => {
          console.log('Solved a group in parallel: ', res)
          results.push(res)
          resolveNext(tasks)
        })
        .catch(reject)
    }
    resolveNext(arrayOfTasks)
  })
}

promisesInSequence(groupedPromises)
  .then((result) => console.log(result))
```

Rashomon
- I didn't downvote, but an array of 100 promises represents 100 asynchronous operations that have already been started. You're only ever going to be able to process one promise at a time as they finish (JS runs your code single-threaded), so there's really no benefit at all to chunking the promises themselves. I think this was probably a trick question, and if the real goal is to not have more than 5 asynchronous operations in flight at the same time, then you have to back up a step BEFORE you start 100 operations and create 100 promises, so you only start 5 operations at a time. – jfriend00 Feb 17 '20 at 22:54
- @jfriend00 I see your point and you are right. The first response should be asking for the definition of "5 processes at a time". – Rashomon Feb 18 '20 at 06:22
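For completeness, the batch-of-5 idea this answer reaches for does throttle work when each chunk is built from operation *starters* rather than already-started promises. A minimal sketch of that corrected batch pattern, where `startOp` is a hypothetical operation factory:

```javascript
// startOp is a hypothetical operation factory: calling it *starts* the
// async operation for index i and returns a promise.
const startOp = (i) =>
  new Promise((resolve) => setTimeout(() => resolve(i), 10));

// Process the inputs in batches: start `size` operations, wait for all
// of them with Promise.all, then move on to the next batch.
async function runInBatches(indices, size) {
  const results = [];
  for (let i = 0; i < indices.length; i += size) {
    const batch = indices.slice(i, i + size).map((n) => startOp(n));
    results.push(...(await Promise.all(batch)));
  }
  return results;
}

runInBatches([...Array(100).keys()], 5).then((r) => console.log(r.length)); // → 100
```

The trade-off versus a worker pool: each batch waits for its slowest member before the next batch starts, so fewer than 5 operations may be in flight near a batch boundary.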