4

I have an array of arrays of promises, and each inner array may contain 4k, 2k, or 500 promises.

In total there are around 60k promises and I may test it with other values as well.

Now I need to execute `Promise.all(BigArray[0])`.

Once the first inner array is done, I need to execute `Promise.all(BigArray[1])` for the next one, and so on.

If I try to execute `Promise.all(BigArray)`, it throws:

fatal error call_and_retry_2 allocation failed - process out of memory

I need to execute each batch of promises in series, not in parallel, which I think is what Node is doing. I shouldn't use new libs; however, I am willing to consider any answer!

Edit:

Here is an example piece of code:

function getInfoForEveryInnerArgument(InnerArray) {
    // Turn each argument of one inner array into a DB-lookup promise
    const CPTPromises = _.map(InnerArray, (argument) => getDBInfo(argument));
    return Promise.all(CPTPromises)
        .then((results) => {
            return doSomethingWithResults(results);
        });
}

function mainFunction() {
    const BigArray = [[argument1, argument2, argument3, argument4], [argument5, argument6, argument7, argument8], ....];
    // the sum of all arguments is over 60k...
    const promiseArrayCombination = _.map(BigArray, (InnerArray, key) => getInfoForEveryInnerArgument(InnerArray));

    return Promise.all(promiseArrayCombination).then((fullResults) => {
        console.log(fullResults);
        return fullResults;
    });
}
Rodrigo Zurek
  • Trying to track the state of 60k promises sounds like a nightmare, and I'm not surprised you're running out of memory. Sounds like you need to break your problem down further, or rethink the architecture. – Heretic Monkey May 12 '16 at 20:29
  • I agree with @MikeMcCaughan. 60k Promises sounds unreasonable; there is probably a better solution to your problem. – nils May 12 '16 at 20:43
  • This doesn't make a whole lot of sense. If you have a giant array of arrays of promises, that means your operations have all been launched already. So, there's no execution in series happening here. They are already executing in parallel. If you just want to know when ALL the promises are done, then please say so since that sounds like the real issue. Now, there's little point in launching 60k async operations at the same time in node.js so that's probably your real problem. I think you need to back up and show us the code that creates 60k promises. That's where the issue is. – jfriend00 May 12 '16 at 20:57
  • Voting to close as ***"unclear what you're asking"*** because you can't serialize 60k promises since if you already have a promise for the 60k async operations, then they are already running in parallel and you can't serialize something that is already running in parallel. – jfriend00 May 12 '16 at 21:54
  • It's quite clear what he is asking; he wants a way to perform his promises in series, as opposed to in parallel like `Promise.all()` does. – Evan Bechtol May 12 '16 at 22:15
  • @EvanBechtol - The solution to that problem would be way back before he created 60k promises. The OP needs to show that code and THEN maybe we could help. Once 60k promises have been created (and thus 60k async operations already started), it's too late to fix the problem. – jfriend00 May 13 '16 at 00:57
  • As I explained in my question, each innerArray has from 500 to 4k promises. Node.js has no problem resolving that amount of promises. What I need is to `Promise.all()` each inner array in series: resolve the 500–4k promises of the first innerArray, then move on to the next innerArray, and so on. – Rodrigo Zurek May 13 '16 at 14:14
  • Are you OK with all 60k operations being run in parallel, but you just want to process the results serially (one sub-array at a time)? Or, do you actually need the operations that each sub-array represents to be run serially and processed serially? The latter would be much safer from a memory and resource point of view in node.js. – jfriend00 May 13 '16 at 14:27

5 Answers

6

`Promise.all()` alone will not work here; you could use `Array.prototype.reduce()` to process BigArray's elements one by one:

BigArray.reduce((promiseChain, currentArray) => {
    return promiseChain.then(chainResults =>
        Promise.all(currentArray).then(currentResult =>
            [...chainResults, currentResult]
        )
    );
}, Promise.resolve([])).then(arrayOfArraysOfResults => {
    // Do something with all results
});
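
Note that if, as in the question's edit, BigArray actually holds batches of raw arguments rather than promises, the same reduce pattern can be applied one level earlier, so that only one batch of promises exists at a time. A sketch reusing the question's hypothetical getInfoForEveryInnerArgument:

BigArray.reduce((promiseChain, innerArguments) => {
    return promiseChain.then(chainResults =>
        // This batch's promises are created only here, after the previous
        // batch has settled, so at most one batch is ever in flight.
        getInfoForEveryInnerArgument(innerArguments).then(currentResult =>
            [...chainResults, currentResult]
        )
    );
}, Promise.resolve([])).then(arrayOfBatchResults => {
    // One entry per inner array, in the original order
});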
greuze
3

Pretty simple to accomplish with async/await in ES2017:

(async () => {
    for (let i = 0; i < BigArray.length; i++) {
        await Promise.all(BigArray[i]);
    }
})();
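
If the results need to be kept, a small variant collects them per batch; this is a sketch, and how errors should ultimately be handled is an assumption:

(async () => {
    const allResults = [];
    for (let i = 0; i < BigArray.length; i++) {
        // Each batch must settle before the next loop iteration begins.
        allResults.push(await Promise.all(BigArray[i]));
    }
    return allResults;
})().catch((err) => console.error(err));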
maksim
  • You're creating orphaned promises. You have no way of referencing them in the parent promise chain. – Robert Mennell Aug 29 '18 at 04:59
  • @RobertMennell thanks for bringing this up. Is the issue you're raising that uncaught exceptions will not be properly handled with this approach? I was trying to figure out the right way to do that; seems like the https://stackoverflow.com/a/30378082/3352978 might be a workable solution. – maksim Aug 30 '18 at 22:54
1

`Promise.all()` waits on all of the promises passed to it in parallel; it rejects upon the first rejection, or resolves once every one of the promises has resolved.

From the MDN:

Promise.all passes an array of values from all the promises in the iterable object that it was passed. The array of values maintains the order of the original iterable object, not the order that the promises were resolved in. If something passed in the iterable array is not a promise, it's converted to one by Promise.resolve.

If any of the passed in promises rejects, the all Promise immediately rejects with the value of the promise that rejected, discarding all the other promises whether or not they have resolved. If an empty array is passed, then this method resolves immediately.
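
For example, a single rejection settles the combined promise immediately (a minimal illustration, not taken from the question's code):

const p1 = Promise.resolve(1);
const p2 = Promise.reject(new Error('boom'));
const p3 = new Promise((resolve) => setTimeout(resolve, 1000, 3));

Promise.all([p1, p2, p3])
    .then((values) => console.log(values))       // never runs
    .catch((err) => console.error(err.message)); // logs "boom" right away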

If you need to resolve all of your promises in series, then the `Promise.all()` method will not work for your application. Instead, you need to find an iterative approach to resolving your promises. This is going to be difficult; Node.js is asynchronous in nature, and an ordinary loop (to my knowledge and experience) will not block until a response is received from a promise inside it.
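
One such iterative approach is to recurse: kick off the next batch only from the previous batch's `.then()` handler. A sketch using the question's hypothetical getInfoForEveryInnerArgument (collecting the results is my assumption):

function processInSeries(batches, index = 0, results = []) {
    // All batches done: resolve with the collected per-batch results.
    if (index >= batches.length) {
        return Promise.resolve(results);
    }
    return getInfoForEveryInnerArgument(batches[index])
        .then((batchResult) =>
            processInSeries(batches, index + 1, results.concat([batchResult])));
}

processInSeries(BigArray).then((allResults) => console.log(allResults.length));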

Edit:

A library exists called promise-series-node, which I think may help you out quite a bit here. Since you already have the promises created, you could just pass it your BigArray:

promiseSeries(BigArray).then((results) => {
    console.log(results);
});

In my personal opinion, your approach of starting with 60k+ promises will not only take a substantial amount of time, but will also strain the resources of the system executing them (which is why you are running out of memory). I think you may want to consider a better architecture for the application.
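
For instance, one possible restructuring creates the promises in bounded chunks instead of all at once, so memory stays flat. CHUNK_SIZE and the flattened allArguments list are assumptions for illustration; getDBInfo is the question's hypothetical function:

// Illustrative cap on in-flight operations; tune to your workload.
const CHUNK_SIZE = 500;

async function processInChunks(allArguments) {
    const results = [];
    for (let i = 0; i < allArguments.length; i += CHUNK_SIZE) {
        const chunk = allArguments.slice(i, i + CHUNK_SIZE);
        // Promises are created here, one chunk at a time, not all up front.
        results.push(...(await Promise.all(chunk.map((arg) => getDBInfo(arg)))));
    }
    return results;
}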

Edit 2: What is a promise?

A promise represents the result of an asynchronous operation, and it can be in one of three states:

  1. Pending: the initial state of the promise
  2. Fulfilled: the state of a promise that represents a successful operation
  3. Rejected: the state of a promise that represents a failed operation

Promises are immutable once they are in the fulfilled or rejected state. You can chain promises (great for avoiding repeated callbacks), as well as nest them (when closure is a concern). There are many great articles on the web about this; here is one I found informative.
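
As a tiny illustration of chaining (using the question's hypothetical getDBInfo and doSomethingWithResults), each `.then()` receives the result of the previous step:

getDBInfo(argument1)
    .then((info) => doSomethingWithResults([info])) // runs after the DB call settles
    .then((processed) => console.log(processed))
    .catch((err) => console.error(err));            // one handler for the whole chain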

Evan Bechtol
  • You don't "execute promises". A promise represents an async operation that has ALREADY been started. You can't serialize 60k async operations that have already been started. They are already running in parallel. So, it isn't clear how this helps at all. – jfriend00 May 12 '16 at 21:36
  • @jfriend00 Judging by all of your comments on this post, nothing here makes sense to you. Maybe you could post an answer and clarify everything. You say promises don't execute, but in your comment to the OP you say they execute. – Evan Bechtol May 12 '16 at 21:41
  • I can't provide an answer because the question doesn't make sense. The real answer is to probably not fire up 60k async operations at once, but the OP doesn't show us ANY of that code so there's not much we can do to help with the real problem. – jfriend00 May 12 '16 at 21:53
  • You don't seem to understand that if the OP already has arrays of promises, those async operations are already executing in parallel. You don't execute promises. A promise is a representation of a future result for an operation that has already been started. All you can do at that point is monitor when they are all done. There is probably no case where you ever want 60k async operations all trying to run at the same time since node.js won't handle that very well, so the real solution here is likely to back up several steps and stop launching 60k async operations at once. – jfriend00 May 12 '16 at 21:58
  • I know very well how promises work; you are arguing over verbiage. If you read my post, you'll notice that I gave him the advice of changing his approach. – Evan Bechtol May 12 '16 at 22:11
  • Well, your first sentence is just wrong. `Promise.all()` does not execute anything. You pass it an array of promises that represent async operations that are **already running**. In fact, there is nothing we can do at all to change the order in which the array of promises executes when just given the array because that has already been decided. You may think this is verbiage - I think it's an incorrect explanation of what is going on. – jfriend00 May 13 '16 at 00:34
  • Then, your first edit implies that promise-series-node is somehow going to help with this, but if there's already 60k async operations in flight, that's the real problem and promise-series-node called on BigArray isn't going to fix that either. Your last paragraph is the only part of the answer that really points to a possible answer. Personally, I think we should just close the question because it's unclear and the OP has not clarified it or offered the info we actually need in order to help in 4 hours. – jfriend00 May 13 '16 at 00:36
  • I have voted to close it. But, it takes 5 votes to actually close it. – jfriend00 May 13 '16 at 00:50
  • @EvanBechtol: Even if we are arguing over verbiage only, you should not use the terminology "execute a promise". That implies a "promise" could be started, stopped, resumed, retried or whatever, and people who are not as fluent as us will get a wrong image of how they work. – Bergi May 13 '16 at 01:04
  • @jfriend00 Please see that I have included an update to address your concerns on terminology and replaced the word "executed". – Evan Bechtol May 13 '16 at 11:45
  • @Bergi Please see above comment – Evan Bechtol May 13 '16 at 11:45
  • Better terminology now. I don't see how your answer does what the OP is asking though. Your recommendation of `promiseSeries()` isn't going to help here. That accepts an array of functions and executes them serially. That isn't what the OP has in their array. It appears they first have data in an array of arrays and then they have promises in an array of arrays. – jfriend00 May 13 '16 at 14:33
0

The promise library Bluebird offers a helper method called `Promise.map()`, which takes an array (or a promise of an array) as its first argument and maps all of its elements to a result array, which in turn also gets promisified. Maybe you could try something like this:

return Promise.map(BigArray, function(innerArray) {
  return Promise.all(innerArray);
})
.then(function(finishedArray) {
  // code after all 60k promises have been resolved comes here
  console.log("done");
});

But as already stated above, this is still a very resource-intensive task that may consume all available memory.
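
If adding Bluebird is acceptable, its concurrency option is one way to cap resource usage: the mapper is only invoked for a bounded number of items at a time, so promises are created on demand rather than all up front. A sketch, assuming the work is restructured to map over the raw arguments (allArguments is a hypothetical flattened list; getDBInfo is the question's hypothetical function):

const Promise = require('bluebird');

// At most 500 getDBInfo() calls are in flight at any moment.
Promise.map(allArguments, (argument) => getDBInfo(argument), { concurrency: 500 })
    .then((results) => {
        console.log('done:', results.length);
    });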

SaSchu
  • The OP said they have an array of array of promises. It's not clear to me how `Promise.map()` would help with that. – jfriend00 May 12 '16 at 21:35
  • `map()` is going to resolve the promises in parallel, but Bluebird's `mapSeries()` function is a decent solution to the OP's problem. http://bluebirdjs.com/docs/api/promise.mapseries.html – Molomby Aug 01 '17 at 01:36
  • @Molomby Yes right, actually this was my intention, I was just wrong about map(). I thought it would be executed serially, but instead mapSeries() does the job. Thanks for pointing this out. – SaSchu Aug 01 '17 at 10:28
0

There is a good answer here: Callback after all asynchronous forEach callbacks are completed

function asyncFunction(item, cb) {
  setTimeout(() => {
    console.log('done with', item);
    cb(item * 10);
  }, 1000 * item);
}

// Wrap each callback-style call in a promise that resolves via the callback.
let requests = [1, 2, 3].map((item) => {
  return new Promise((resolve) => {
    asyncFunction(item, resolve);
  });
});

Promise.all(requests).then((arr) => {
  console.log(arr);
  console.log('done');
});
zloctb