My question is basically a combination of two things: running promises with limited concurrency, and collecting every result the way Promise.allSettled does.

I'm aware of Promise.allSettled, but I'm failing to find a good way to also limit concurrency.

What I have so far:

Idea 1 using p-limit:

const pLimit = require('p-limit');
const limit = pLimit(10);

const promises = files.map(pair => {
    const formData = {
        'file1': fs.createReadStream(pair[0]),
        'file2': fs.createReadStream(pair[1])
    };

    return limit(() => uploadForm(formData));
});
    
(async () => {
    const results = await Promise.allSettled(promises);
    results.forEach(result => {
        if (result.status === "rejected")
            file.write(result.reason + '\n---\n');
    });
})();

My problem with this solution is that I have to create all the promises up front, which opens two file streams per promise, so I quickly hit the limit of open files.

Idea 2 using p-queue: I experimented with generator functions to create and add new promises in the queue.on('next') handler, but I couldn't get it to work properly, and this is probably not the right tool for the job.
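
For reference, a minimal sketch of how Idea 2 could be wired up without generators, assuming p-queue v6 (the last CommonJS release) and reusing uploadForm, files and the file log stream from above; queue.add() only runs its callback once a slot is free, so the streams are opened lazily:

const { default: PQueue } = require('p-queue');
const fs = require('fs');

const queue = new PQueue({ concurrency: 10 });

// queue.add() defers the callback until one of the 10 slots frees up,
// so the two read streams per pair are opened right before the upload starts.
const tasks = files.map(pair =>
    queue.add(() => uploadForm({
        'file1': fs.createReadStream(pair[0]),
        'file2': fs.createReadStream(pair[1])
    }))
);

(async () => {
    const results = await Promise.allSettled(tasks);
    results
        .filter(result => result.status === 'rejected')
        .forEach(result => file.write(result.reason + '\n---\n'));
})();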

Idea 3 using a PromisePool: This looked very promising at first. Some implementations accept a generator function to create the promises for the pool, but I couldn't find one that explicitly states that it behaves like Promise.allSettled.

I implemented es6-promise-pool only to find out that it will stop after the first promise rejection.
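
For reference, a sketch of how es6-promise-pool could be coaxed into Promise.allSettled-like behaviour, again assuming the uploadForm, files and file variables from above: each produced promise catches its own rejection and records the outcome, so the pool never sees a failure and keeps going.

const PromisePool = require('es6-promise-pool');
const fs = require('fs');

const results = [];
let index = 0;

// The producer hands the pool one upload at a time and returns null when
// there is nothing left, which tells the pool it is done.
const producer = () => {
    if (index >= files.length) return null;
    const pair = files[index++];
    return uploadForm({
        'file1': fs.createReadStream(pair[0]),
        'file2': fs.createReadStream(pair[1])
    })
        .then(value => results.push({ status: 'fulfilled', value }))
        .catch(reason => results.push({ status: 'rejected', reason }));
};

const pool = new PromisePool(producer, 10);

pool.start().then(() => {
    results
        .filter(result => result.status === 'rejected')
        .forEach(result => file.write(result.reason + '\n---\n'));
});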

Darkproduct

2 Answers

It's simple enough to implement this yourself: make an array of functions that, when called, return the Promise. Then implement a limiter function that takes a function from that array, calls it, and once it finishes, recursively calls the limiter again until the array is empty:

const request = (file) => new Promise((res, rej) => {
  console.log('requesting', file);
  setTimeout(() => {
    if (Math.random() < 0.5) {
      console.log('resolving', file);
      res(file);
    } else {
      console.log('rejecting', file);
      rej(file);
    }
  }, 1000 + Math.random() * 1000);
});
const files = [1, 2, 3, 4, 5, 6];

const makeRequests = files.map(file => () => request(file));
const results = [];
let started = 0;
const recurse = () => {
  const i = started++;
  const makeRequest = makeRequests.shift();
  return !makeRequest ? null : Promise.allSettled([makeRequest()])
    .then(result => {
      results[i] = result[0];
      return recurse();
    })
};
const limit = 2;
Promise.all(Array.from({ length: limit }, recurse))
  .then(() => {
    console.log(results);
  });

If the order of the results doesn't matter, it can be simplified by removing the started and i variables.
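
For illustration, a sketch of that simplification, assuming the same request and makeRequests setup as above (results end up in completion order rather than input order):

const results = [];
const recurse = () => {
  const makeRequest = makeRequests.shift();
  // When the array is empty, return null so this worker stops.
  return !makeRequest ? null : Promise.allSettled([makeRequest()])
    .then(result => {
      results.push(result[0]);
      return recurse();
    });
};
const limit = 2;
Promise.all(Array.from({ length: limit }, recurse))
  .then(() => {
    console.log(results);
  });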

CertainPerformance

The accepted answer works more or less like p-limit. You were having issues with p-limit because the file streams were created outside the limit callback, so they were all opened up front instead of when each upload actually ran.

This would have solved your problem:

let promises = files.map(pair => {  
    return limit(() => uploadForm({
        'file1': fs.createReadStream(pair[0]),
        'file2': fs.createReadStream(pair[1])
    }));
});
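
For completeness, a sketch of the full wiring with this fix, reusing uploadForm, files and the file log stream from the question; because the streams are now created inside the limit callback, at most 10 uploads run at once and at most 20 read streams are open at any time:

const pLimit = require('p-limit');
const fs = require('fs');

const limit = pLimit(10);

// The streams are created inside the callback, which p-limit only invokes
// once one of its 10 slots is free.
const promises = files.map(pair =>
    limit(() => uploadForm({
        'file1': fs.createReadStream(pair[0]),
        'file2': fs.createReadStream(pair[1])
    }))
);

(async () => {
    const results = await Promise.allSettled(promises);
    results
        .filter(result => result.status === 'rejected')
        .forEach(result => file.write(result.reason + '\n---\n'));
})();
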
Jean-Baptiste Martin
  • I wanted to do this, but I can't figure out how to use p-limit without changing my entire application to an ES module and changing all my requires to imports. :( – Jason C Jan 27 '22 at 01:59
  • @JasonC try an older version of p-limit. Usually the switch to ESM comes with a new major version, so try the previous one. – Jean-Baptiste Martin Jan 27 '22 at 10:53
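
If downgrading isn't an option, the ESM-only versions of p-limit can also be loaded from a CommonJS file with a dynamic import(); a minimal sketch:

(async () => {
    // Dynamic import() works from CommonJS on any Node version with ESM
    // support, so the rest of the app can keep using require().
    const { default: pLimit } = await import('p-limit');
    const limit = pLimit(10);
    // ...use limit() exactly as in the answer above
})();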