
I am making ~400 requests to a server, each wrapped in a promise.

When running all 400 requests in a single `Promise.all`, the system falls over.

I've split my requests into batches of 50 promises (passing each batch to a `Promise.all`), and added all of those into another `Promise.all`.

How can I run the promises in batches, waiting for each batch to finish before moving on to the next?

// attach the other accounts a user has to the wrapper object
// **this is 400+ requests object, that has the requests in it**
// results are promises
const influencerAccounts = wrapper.map(p => addInfluencerAccounts(p));

// split the requests into chunks to stop the server falling over
const chunkedPromises = _.chunk(influencerAccounts, 50);

// promise.all on each chunk of promises/requests
// ????


// ...

return

I've tried looping over the chunked promise arrays (an array of arrays of promises) and calling `Promise.all` on each one - but that doesn't wait for the previous batch to finish before sending the next.

Thanks,

Ollie

  • I guess `createInfluencerWrapper` has the fetch request in it, so by the time you set `wrapper` you've already made all the requests. If you'd like to throttle the number of requests, wrap the function that returns a promise in a [throttler](https://stackoverflow.com/a/48001650/1641941), and maybe don't let promises reject in the Promise.all(something.map, so [return a fail object](https://stackoverflow.com/a/47678417/1641941) (last code block) and sort out the rejections and successes after. – HMR May 16 '18 at 16:17
  • You might find it a lot easier to pick up the Bluebird promise library and use the `concurrency` setting with [Bluebird's `Promise.map()`](http://bluebirdjs.com/docs/api/promise.map.html). That will let you easily code your 400 requests, but separately let you specify how many you want "in flight" at the same time to control memory and resource usage while processing them all. Then you won't have to manually chunk them and it will both tell you when they're all done and keep track of all the results for you (in order). – jfriend00 May 16 '18 at 17:01
  • Or, several other implementations you can copy and use: [Promise.all consumes all my RAM](https://stackoverflow.com/questions/46654265/promise-all-consumes-all-my-ram/46654592#46654592) or [How to control how many promises access network in parallel](https://stackoverflow.com/questions/41028790/javascript-how-to-control-how-many-promises-access-network-in-parallel/41028877#41028877) or [Loop through an API with variable URL](https://stackoverflow.com/questions/48842555/loop-through-an-api-get-request-with-variable-url/48844820#48844820). – jfriend00 May 16 '18 at 17:05
  • Also, [Make several requests to an api that can only handle 20 requests at a time](https://stackoverflow.com/questions/33378923/make-several-requests-to-an-api-that-can-only-handle-20-request-a-minute/33379149#33379149) – jfriend00 May 16 '18 at 17:07
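The comments above suggest limiting concurrency (as Bluebird's `Promise.map` does with its `concurrency` option) rather than using fixed batches. As a rough sketch of that idea - the function and helper names here are mine, not Bluebird's API - a small pool of "workers" can each pull the next item as soon as their previous request settles:

```javascript
// Run `startRequest` on each item with at most `limit` requests
// in flight at once; resolves with the results in input order.
function mapWithConcurrency(items, limit, startRequest) {
    const results = new Array(items.length);
    let next = 0;
    // Each worker takes the next unclaimed index, stores its result,
    // and loops until the list is exhausted.
    const worker = () => {
        if (next >= items.length) return Promise.resolve();
        const i = next++;
        return Promise.resolve(startRequest(items[i])).then(value => {
            results[i] = value;
            return worker();
        });
    };
    const workers = Array.from(
        {length: Math.min(limit, items.length)},
        worker
    );
    return Promise.all(workers).then(() => results);
}

// Usage: five "requests", never more than two in flight
mapWithConcurrency([1, 2, 3, 4, 5], 2, n => Promise.resolve(n * 2))
    .then(results => console.log(results)); // [2, 4, 6, 8, 10]
```

Unlike fixed chunks, this keeps the pipeline full: a slow request in one "batch" doesn't hold up unrelated requests.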

1 Answer


You're making a mistake a lot of people make at first: `Promise.all` doesn't run anything; it just waits for things that are already running. By the time you've broken your `influencerAccounts` array into chunks, you've probably already overloaded the server, because you're still starting all 400+ requests at the same time.

Instead, chunk the payout array, and then process it in chunks, something along these lines:

const results = [];
const promise =
    _.chunk(payout, 50).reduce(
        (p, chunk) =>
            p.then(chunkResults => {
                results.push(...chunkResults);
                return Promise.all(chunk.map(startRequest));
            }),
        Promise.resolve([])
    )
    .then(chunkResults => {
        // the final chunk's results arrive here
        results.push(...chunkResults);
        return results;
    });

I've used startRequest above instead of createInfluencerWrapper and addInfluencerAccounts because it wasn't clear to me whether you'd introduced one or the other in an attempt to make your chunking work. If not, startRequest is simply addInfluencerAccounts(createInfluencerWrapper(entry)).

That starts a chunk of 50 requests, uses Promise.all to wait for all of them to complete, then starts the next chunk of 50 requests. The "do this then when it's done do that" part comes from the promise reduce idiom, which in its simple form looks like this:

someArray.reduce((p, entry) => p.then(() => doSomethingWith(entry)), Promise.resolve());

It starts with a resolved promise, and hooks a then handler on it to do the next thing, which hooks a then handler on that to do the next thing, etc.
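To see the idiom end to end, here's a self-contained sketch (the `processInChunks` and `chunk` names are mine, and `chunk` is a minimal stand-in for lodash's `_.chunk`) that processes an array in sequential batches:

```javascript
// Minimal stand-in for lodash's _.chunk.
const chunk = (arr, size) => {
    const out = [];
    for (let i = 0; i < arr.length; i += size) {
        out.push(arr.slice(i, i + size));
    }
    return out;
};

// Run `startRequest` on each item, `size` at a time, waiting for
// each batch to settle before starting the next.
function processInChunks(items, size, startRequest) {
    const results = [];
    return chunk(items, size)
        .reduce(
            (p, batch) =>
                p.then(batchResults => {
                    results.push(...batchResults);
                    return Promise.all(batch.map(startRequest));
                }),
            Promise.resolve([])
        )
        .then(batchResults => {
            results.push(...batchResults); // include the final batch
            return results;
        });
}

// Usage: "requests" that resolve immediately
processInChunks([1, 2, 3, 4, 5], 2, n => Promise.resolve(n * 10))
    .then(results => console.log(results)); // [10, 20, 30, 40, 50]
```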


If you don't like closing over results, we can pass it along the reduce chain; here's the first version above doing that:

const {p, results} = _.chunk(payout, 50).reduce(
    ({p, results}, chunk) => ({
        p: p.then(chunkResults => {
            results.push(...chunkResults);
            return Promise.all(chunk.map(startRequest));
        }),
        results
    }),
    {p: Promise.resolve([]), results: []}
);
const promise = p.then(chunkResults => {
    // the final chunk's results arrive here
    results.push(...chunkResults);
    return results;
});
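As an aside, the same wait-for-each-batch behaviour reads very naturally with `async`/`await` if it's available to you; a sketch (function name is mine, with a plain slicing loop standing in for `_.chunk`):

```javascript
// Process `items` in sequential batches of `size`, awaiting each
// batch before starting the next.
async function processAllInChunks(items, size, startRequest) {
    const results = [];
    for (let i = 0; i < items.length; i += size) {
        const batch = items.slice(i, i + size);
        // Promise.all starts the whole batch; await pauses until it settles
        results.push(...await Promise.all(batch.map(startRequest)));
    }
    return results;
}

// Usage:
processAllInChunks([1, 2, 3], 2, n => Promise.resolve(n + 1))
    .then(results => console.log(results)); // [2, 3, 4]
```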
T.J. Crowder
  • It's simpler to start the reduction with `Promise.resolve([])`, thereby delivering at each iteration the `results` array *via* a promise rather than alongside a promise. – Roamer-1888 May 16 '18 at 22:53
  • @Roamer-1888 - ***Doh!*** Thanks! That's just...so obvious in retrospect. :-) – T.J. Crowder May 17 '18 at 07:07
  • @T.J.Crowder - thank you so much - I was under the impression that promises did nothing unless 'resolved' - thanks so much for the distinction. This was incredibly helpful. – Ollie May 17 '18 at 08:16
  • @Ollie - I'm really glad. You're not at all alone in having gotten that impression about them. :-) – T.J. Crowder May 17 '18 at 08:21
  • What I had in mind was `return _.chunk(payout, 50).reduce((p, chunk) => { return p.then(results => { return Promise.all(chunk.map(addInfluencerAccounts)) .then(newResults => results.concat(newResults)); }); }, Promise.resolve([]));`. – Roamer-1888 May 17 '18 at 11:08
  • @Roamer-1888 - That's basically the above, but creating and throwing away a bunch of temporary arrays rather than just adding to the one array. – T.J. Crowder May 17 '18 at 11:44
  • Yes, exactly so, though `.concat()` isn't really the central theme. `.then(newResults => { results.push(...newResults); return results; })` would be more efficient in that regard. – Roamer-1888 May 17 '18 at 13:19