300

I would like to clarify this point, as the documentation is not entirely clear about it:

Q1: Is Promise.all(iterable) processing all promises sequentially or in parallel? Or, more specifically, is it the equivalent of running chained promises like

p1.then(p2).then(p3).then(p4).then(p5)....

or is it some other kind of algorithm where all p1, p2, p3, p4, p5, etc. are being called at the same time (in parallel) and results are returned as soon as all resolve (or one rejects)?

Q2: If Promise.all runs in parallel, is there a convenient way to run an iterable sequentially?

Note: I don't want to use Q, or Bluebird, but all native ES6 specs.

Yanick Rochon
  • 51,409
  • 25
  • 133
  • 214
  • 1
    Are you asking about node (V8) implementation, or about the spec? – Amit Jun 13 '15 at 21:22
  • 1
    I'm pretty sure `Promise.all` executes them in parallel. – royhowie Jun 13 '15 at 21:23
  • @Amit I flagged `node.js` and `io.js` as this is where I'm using it. So, yes, the V8 implementation if you will. – Yanick Rochon Jun 13 '15 at 21:24
  • 14
    Promises cannot "be executed". They start their task when they are being *created* - they represent the results only - and *you* are executing everything in parallel even before passing them to `Promise.all`. – Bergi Jun 13 '15 at 21:27
  • 1
    Promises are executed at the moment of creation. (can be confirmed by running a bit of code). In `new Promise(a).then(b); c();` a is executed first, then c, then b. It isn't Promise.all that runs these promises, it just handles when they resolve. – Mateon1 Jun 13 '15 at 21:31
  • Just for clarification: The only portion of a `Promise` that gets executed (immediately) is the executor, so the function you pass to the `Promise` constructor. If `Promise.all` awaits the resolving of all given `Promise`s (or the rejection of one) it wouldn't make much sense, if these were settled sequentially. –  Jul 10 '16 at 05:48
  • They are executed in the order they were declared, because most JavaScript environments run single-threaded. So declaring p1 before p2 and calling Promise.all([ p2, p1 ]) wouldn't help – tu4n Jan 08 '17 at 00:41

14 Answers

381

Is Promise.all(iterable) executing all promises?

No, promises cannot "be executed". They start their task when they are being created - they represent the results only - and you are executing everything in parallel even before passing them to Promise.all.

Promise.all does only await multiple promises. It doesn't care in what order they resolve, or whether the computations are running in parallel.
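A quick sketch (with a made-up `delay` helper) illustrates this: the work starts the moment each promise is created, and `Promise.all` merely waits for the results.

```javascript
// Hypothetical helper: logs when its work starts, resolves after `ms` milliseconds.
const delay = (ms, label) =>
  new Promise(resolve => {
    console.log(`${label} started`); // logs at creation time
    setTimeout(() => resolve(label), ms);
  });

const p1 = delay(100, "p1"); // "p1 started" is logged here, immediately
const p2 = delay(50, "p2");  // "p2 started" is logged here too

// Promise.all only awaits; results keep the input order, not completion order
Promise.all([p1, p2]).then(results => {
  console.log(results); // ["p1", "p2"]
});
```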

is there a convenient way to run an iterable sequentially?

If you already have your promises, you can't do much but Promise.all([p1, p2, p3, …]) (which does not have a notion of sequence). But if you do have an iterable of asynchronous functions, you can indeed run them sequentially. Basically you need to get from

[fn1, fn2, fn3, …]

to

fn1().then(fn2).then(fn3).then(…)

and the solution to do that is using Array::reduce:

iterable.reduce((p, fn) => p.then(fn), Promise.resolve())
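To make the reduce pattern concrete, here is a runnable sketch with three stand-in asynchronous functions (the names and delays are made up for illustration):

```javascript
const wait = ms => new Promise(resolve => setTimeout(resolve, ms));
const order = [];

// Each factory returns an async function; nothing runs until the chain calls it
const makeTask = id => async () => {
  await wait(10);
  order.push(id); // record completion order to show the sequencing
  return id;
};

const tasks = [makeTask("a"), makeTask("b"), makeTask("c")];

tasks
  .reduce((p, fn) => p.then(fn), Promise.resolve())
  .then(last => {
    console.log(order); // ["a", "b", "c"] - strictly one after another
    console.log(last);  // "c" - the chain resolves with the last result
  });
```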
Bergi
  • 630,263
  • 148
  • 957
  • 1,375
  • 3
    In this example, is iterable an array of the functions that return a promise that you want to call? – James Reategui Jan 15 '16 at 23:40
  • @JamesReategui: Yes, exactly, that's what I meant by "*an iterable of asynchronous functions*" – Bergi Jan 16 '16 at 10:34
  • `iterable.reduce((p, fn) => p.then(fn), Promise.resolve());` where can you put your code that runs when the last promise resolves? – SSH This May 31 '16 at 21:48
  • 2
    @SSHThis: It's exactly as the `then` sequence - the return value is the promise for the last `fn` result, and you can chain other callbacks to that. – Bergi May 31 '16 at 21:58
  • i'm struggling to figure out the reduce. Can you give a real code example with the fn1, fn2, fn3 run sequentially on an array of objects? – TimoSolo Sep 08 '16 at 12:44
  • @TimoSolo: `objects.reduce((p, o) => p.then(()=>fn(o)), Promise.resolve())` – Bergi Sep 08 '16 at 12:45
  • If you need to use the value from fn1 in fn2, for instance, then do it by letting `fn1().then(...` return a promise by calling fn2, thus keeping the chain. Like so: `fn1().then( retValFromF1 => {return p2(retValFromF1)}).then(fn3).catch(...` – wojjas Nov 30 '16 at 22:19
  • 1
    @wojjas That's exactly equivalent to `fn1().then(p2).then(fn3).catch(…`? No need to use a function expression. – Bergi Dec 01 '16 at 00:07
  • @Bergi Using a function expression makes it possible to pass variables. Without it there will be no retValFromF1 to pass to p2. Can it be done in some other/better way? The `return` is not needed if there is only one statement in the function as in my example. But as soon as there are more than one a `p2(retValFromF1)` is needed. – wojjas Dec 01 '16 at 09:13
  • 1
    @wojjas Of course the `retValFromF1` is passed into `p2`, that's exactly what `p2` does. Sure, if you want to do more (pass additional variables, call multiple functions, etc) you need to use a function expression, though changing `p2` in the array would be easier – Bergi Dec 01 '16 at 11:22
  • @Bergi Can I say that: `iterable.reduce((p, fn) => p.then(fn), Promise.resolve())` is equivalent to: `[fn1, fn2, fn3].reduce((p, fn) => p.then(fn), Promise.resolve())` ??? – robe007 Aug 07 '18 at 16:17
  • 1
    @robe007 Yes, I meant that `iterable` is the `[fn1, fn2, fn3, …]` array – Bergi Aug 07 '18 at 16:21
  • @Bergi And ... every `fn...` it is a function that returns a promise??? – robe007 Aug 07 '18 at 16:23
  • 1
    @robe007 Yes, they're asynchronous functions. – Bergi Aug 07 '18 at 16:28
  • `iterable.reduce((p, fn) => p.then(fn), Promise.resolve())` gave me a headache in nodeJS and seems to have a different behaviour than doing it in 2 lines, eg : `ps.reduce((p, fn) => { p.then(fn); return Promise.resolve(); });` – HaneTV Mar 14 '19 at 10:08
  • 1
    @HaneTV The comma separates arguments: `iterable.reduce((p, fn) => { return p.then(fn); }, Promise.resolve())` – Bergi Mar 14 '19 at 12:47
176

In parallel

await Promise.all(items.map(async (item) => { 
  await fetchItem(item) 
}))

Advantages: Faster. All iterations will be started even if one fails later on. However, it will "fail fast": the returned promise rejects as soon as any one promise rejects. Use Promise.allSettled to complete all iterations even when some reject. Technically, these are concurrent invocations, not parallel ones.
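A sketch of the `Promise.allSettled` variant (the promises here are stand-ins): every outcome is reported, so one rejection doesn't hide the others.

```javascript
const tasks = [
  Promise.resolve("ok"),
  Promise.reject(new Error("boom")),
  Promise.resolve("also ok"),
];

Promise.allSettled(tasks).then(results => {
  for (const r of results) {
    // each result is { status: "fulfilled", value } or { status: "rejected", reason }
    console.log(r.status, r.status === "fulfilled" ? r.value : r.reason.message);
  }
});
```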

In sequence

for (const item of items) {
  await fetchItem(item)
}

Advantages: Variables in the loop can be shared by each iteration. Behaves like normal imperative synchronous code.

david_adler
  • 9,690
  • 6
  • 57
  • 97
  • 23
    Or: `for (const item of items) await fetchItem(item);` – Robert Penner Feb 24 '18 at 21:33
  • 2
    @david_adler In parallel example advantages you said **All iterations will be executed even if one fails**. If I'm not wrong this would still fail fast. To change this behaviour one can do something like: `await Promise.all(items.map(async item => { return await fetchItem(item).catch(e => e) }))` – Taimoor Nov 07 '18 at 09:33
  • @Taimoor yes it does "fail fast" and continue executing code after the Promise.all but all iterations are still executed https://codepen.io/mfbx9da4/pen/BbaaXr – david_adler Feb 25 '19 at 17:12
  • This approach is better when the `async` function is an API call and you don't want to DDOS the server. You have better control over the individual results and errors thrown in the execution. Even better, you can decide on which errors to continue and on which to break the loop. – mandarin Oct 11 '19 at 13:20
  • 1
    Note that javascript isn't actually executing the asynchronous requests in "parallel" using threads since javascript is single threaded. https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop – david_adler Jun 18 '20 at 16:18
  • if you need the results for the parallel version: let results = await Promise.all(items.map(async item => { return await fetchItem(item) })); and re the comment about not being in "parallel", if the requests call an external function such as fetching a file or api call then they ARE truly parallel despite the javascript engine being single threaded. – twhitehead Nov 07 '20 at 08:00
  • @twhitehead yes the engine is multithreaded but ultimately all comes back to javascripts main thread for serialization of the responses. By truly parallel, you mean new threads are spawned for each request in the engine under the hood? Is that how all engines work? Is that part of the spec? Or is that just the only way things could work? Could you point to something in the spec by any chance? – david_adler Nov 27 '20 at 09:38
  • The defined async function with await inside the map method is redundant. – Dmitriy Mozgovoy Feb 11 '21 at 22:05
  • @DmitriyMozgovoy yes it is but it will be more obvious how to extend it for people new to Promise.all – david_adler Apr 29 '21 at 12:16
  • Use `Promise.allSettled` if you want all items to be *executed* (and not just started) regardless if one fails. – david_adler Aug 03 '21 at 09:10
  • 2
    In the parallel example, is the `await` in `await fetchItem(item)` necessary? Why not `await Promise.all(items.map(item => fetchItem(item)))` – kehers Apr 06 '22 at 06:21
  • It's not necessary but I left it in for illustrative purposes as it's easy to see how to modify code inside the inner arrow function. – david_adler Apr 06 '22 at 09:56
  • @david_adler i actually think using the inner await makes it synchronous – matttm Apr 24 '22 at 13:04
  • In what way synchronous? Each inner callback will be executed concurrently with respect to each other. – david_adler Apr 24 '22 at 18:36
  • A note regarding the parallel section, To the best of my knowledge - Its concurrent execution *not parallel*. -> The difference is, That 1 promise will be executed, then the next one will be executed without waiting for the first to resolve. Parallel is when all of the promises are *executed* at the same time, which I think is not possible with promises ,it might be possible with RXJS observable and the fork join operator - (But Im not 100% sure if its really subscribing to each one in parallel or concurrently just like promise.all). – coderrr22 Dec 15 '22 at 10:44
  • Yeah parallel isn't super accurate, concurrent would be more accurate. Any JS code itself is always single threaded and happens in serial. – david_adler Dec 15 '22 at 11:29
50

NodeJS does not run promises in parallel, it runs them concurrently, since it's a single-threaded event-loop architecture. It is possible to run things in parallel by creating new child processes to take advantage of a multi-core CPU.

Parallel vs. Concurrent

In fact, what Promise.all does is stack the promises in the appropriate queue (see event loop architecture), run them concurrently (call P1, P2, ...), wait for each result, then resolve the Promise.all with all the promises' results. Promise.all rejects at the first promise that fails, unless you manage the rejection yourself.

There is a major difference between parallel and concurrent: parallel runs different computations in separate processes at exactly the same time, each progressing at its own rhythm, while concurrent starts one computation after another without waiting for the previous one to finish, so they progress together without depending on each other.

Finally, to answer your question: the promises passed to Promise.all execute neither in parallel nor sequentially, but concurrently.
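The difference shows up in the timing: three 100 ms waits awaited with `Promise.all` overlap and finish in roughly 100 ms total, not 300 ms (a sketch with made-up delays):

```javascript
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function demo() {
  const start = Date.now();
  // All three timers are pending at once; Promise.all just waits for them
  await Promise.all([delay(100), delay(100), delay(100)]);
  const elapsed = Date.now() - start;
  console.log(`elapsed: ~${elapsed} ms`); // roughly 100, not 300
  return elapsed;
}

demo();
```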

mysl
  • 1,003
  • 2
  • 9
  • 19
Adrien De Peretti
  • 3,342
  • 16
  • 22
  • 7
    This is not right. NodeJS can run things in parallel. NodeJS has a concept of worker thread. By default the number of worker thread is 4. For example, if you use crypto library to hash two values then you can execute them in parallel. Two worker threads will handle the task. Of course, your CPU has to be multi-core to support parallelism. – Shihab May 11 '20 at 17:51
  • 3
    Yeah you right, it’s what i said at the end of the first paragraph, but i talked about child process, of course they can run workers. – Adrien De Peretti May 11 '20 at 17:54
  • 4
    Best answer so far. I was so confused that how a single-threaded architecture like Node.js could run multiple promises in parallel. Thanks alot sir. P.S. I know how worker threads are and how they work but promises are resolved by Node.js event-loop itself and not by using libuv. So the best Node.js could do is to execute them (promises) concurrently. – Muhammad Muzammil Jan 19 '21 at 06:25
14

Bergi's answer got me on the right track using Array.reduce.

However, to actually get the functions returning my promises to execute one after another I had to add some more nesting.

My real use case is an array of files that I need to transfer in order one after another due to limits downstream...

Here is what I ended up with:

getAllFiles().then( (files) => {
    return files.reduce((p, theFile) => {
        return p.then(() => {
            return transferFile(theFile); //function returns a promise
        });
    }, Promise.resolve()).then(()=>{
        console.log("All files transferred");
    });
}).catch((error)=>{
    console.log(error);
});

As previous answers suggest, using:

getAllFiles().then( (files) => {
    return files.reduce((p, theFile) => {
        return p.then(transferFile(theFile));
    }, Promise.resolve()).then(()=>{
        console.log("All files transferred");
    });
}).catch((error)=>{
    console.log(error);
});

didn't wait for the transfer to complete before starting another and also the "All files transferred" text came before even the first file transfer was started.

Not sure what I did wrong, but wanted to share what worked for me.

Edit: Since I wrote this post I now understand why the first version didn't work. then() expects a function returning a promise, so you should pass in the function name without parentheses! Since my function wants an argument, I need to wrap it in an anonymous function taking no arguments!

Tomerikoo
  • 18,379
  • 16
  • 47
  • 61
tkarls
  • 3,171
  • 3
  • 27
  • 27
8

You can also process an iterable sequentially with an async function using a recursive function. For example, given an array a to process with asynchronous function someAsyncFunction():

var a = [1, 2, 3, 4, 5, 6]

function someAsyncFunction(n) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log("someAsyncFunction: ", n)
      resolve(n)
    }, Math.random() * 1500)
  })
}

//You can run the array items sequentially with:

function sequential(arr, index = 0) {
  if (index >= arr.length) return Promise.resolve()
  return someAsyncFunction(arr[index])
    .then(r => {
      console.log("got value: ", r)
      return sequential(arr, index + 1)
    })
}

sequential(a).then(() => console.log("done"))
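As suggested in the comments, the same recursion can carry an accumulator so it resolves with all the results, similar in shape to Promise.all but sequential. This sketch redefines a simple stand-in `someAsyncFunction` so it is self-contained:

```javascript
// Stand-in async function for illustration: resolves with double its input
const someAsyncFunction = n =>
  new Promise(resolve => setTimeout(() => resolve(n * 2), 10));

// Same recursive shape as above, but each result is collected into `acc`
function sequentialCollect(arr, index = 0, acc = []) {
  if (index >= arr.length) return Promise.resolve(acc);
  return someAsyncFunction(arr[index]).then(r => {
    acc.push(r);
    return sequentialCollect(arr, index + 1, acc);
  });
}

sequentialCollect([1, 2, 3]).then(results => console.log(results)); // [2, 4, 6]
```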
Mark
  • 90,562
  • 7
  • 108
  • 148
  • using `array.prototype.reduce` is much better in terms of performance than a recursive function – Mateusz Sowiński Jul 18 '18 at 15:32
  • @MateuszSowiński, there is a 1500ms timeout between each call. Considering that this is doing async calls sequentially, it’s hard to see how that’s relevant even for a very quick async turnaround. – Mark Jul 18 '18 at 15:44
  • Let's say you have to execute 40 of really quick async functions after each other - using recursive functions would clog your memory pretty fast – Mateusz Sowiński Jul 18 '18 at 15:46
  • @MateuszSowiński, that the stack doesn't wind up here...we're returning after each call. Compare that with `reduce` where you have to build the entire `then()` chain in one step and then execute. – Mark Jul 18 '18 at 15:56
  • In the 40th call of the sequential function the first call of the function is still in memory waiting for the chain of sequential functions to return – Mateusz Sowiński Jul 18 '18 at 16:00
  • I'll refer you to this thread if you really want to discuss the nuances of this @MateuszSowiński: https://stackoverflow.com/questions/29925948/building-a-promise-chain-recursively-in-javascript-memory-considerations – Mark Jul 18 '18 at 16:23
  • you could add a third parameter with an accumulator of type array, and return it in the last `Promise.resolve()` - will be in the same format as Promise.all but sequentially – Bernardo Dal Corno Sep 28 '20 at 15:52
  • One thing I dont like about this way is the need to create an "extra" function (`sequential`) for it to work, in comparison to `for` and `reduce` solutions. But it could very much be in a tool like lodash, for example – Bernardo Dal Corno Sep 28 '20 at 15:53
  • for completeness, you should make `a` an array of promises, and then change `someAsyncFunction()` to simply `arr[index]()`, without forgeting to use `await` – Bernardo Dal Corno Sep 28 '20 at 15:56
5

Just to elaborate on @Bergi's answer (which is very succinct, but tricky to understand ;)

This code will run each item in the array and add the next 'then chain' to the end:

function eachorder(prev, order) {
  return prev.then(function() {
    return get_order(order)
      .then(check_order)
      .then(update_order);
  });
}
orderArray.reduce(eachorder, Promise.resolve());
Tomerikoo
  • 18,379
  • 16
  • 47
  • 61
TimoSolo
  • 7,068
  • 5
  • 34
  • 50
4

Using async/await, an array of promise-returning functions can easily be executed sequentially:

let a = [fn1, fn2, fn3]; // each fn returns a promise

async function func() {
  for (let i = 0; i < a.length; i++) {
    await a[i]();
  }
}

func();

Note: In the above implementation, if a promise is rejected, the rest won't be executed. If you want all of them to run regardless, wrap your await a[i](); in a try/catch.
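A sketch of that try/catch variant (the task functions are made up): a rejection is recorded and the loop keeps going.

```javascript
const tasks = [
  () => Promise.resolve("first"),
  () => Promise.reject(new Error("second failed")),
  () => Promise.resolve("third"),
];

async function runAll() {
  const results = [];
  for (const task of tasks) {
    try {
      results.push(await task());
    } catch (err) {
      // record the failure and continue instead of aborting the loop
      results.push(`error: ${err.message}`);
    }
  }
  return results;
}

runAll().then(results => console.log(results));
// ["first", "error: second failed", "third"]
```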

Ayan
  • 8,192
  • 4
  • 46
  • 51
3

In parallel

See this example:

const resolveAfterTimeout = async i => {
  return new Promise(resolve => {
    console.log("CALLED");
    setTimeout(() => {
      resolve(`RESOLVED ${i}`); // resolve takes a single value
    }, 5000);
  });
};

const call = async () => {
  const res = await Promise.all([
    resolveAfterTimeout(1),
    resolveAfterTimeout(2),
    resolveAfterTimeout(3),
    resolveAfterTimeout(4),
    resolveAfterTimeout(5),
    resolveAfterTimeout(6)
  ]);
  console.log({ res });
};

call();

Running this code logs "CALLED" for all six promises immediately; then, once they all resolve after the timeout, the six responses are logged together.

2

Bergi's answer helped me make the calls sequential. I have added an example below where we call each function only after the previous one has completed:

function func1 (param1) {
    console.log("function1 : " + param1);
}
function func2 () {
    console.log("function2");
}
function func3 (param2, param3) {
    console.log("function3 : " + param2 + ", " + param3);
}

function func4 (param4) {
    console.log("function4 : " + param4);
}
param4 = "Kate";

//adding 3 functions to array

a=[
    ()=>func1("Hi"),
    ()=>func2(),
    ()=>func3("Lindsay",param4)
  ];

//adding 4th function

a.push(()=>func4("dad"));

//below does func1().then(func2).then(func3).then(func4)

a.reduce((p, fn) => p.then(fn), Promise.resolve());
Tomerikoo
  • 18,379
  • 16
  • 47
  • 61
Nithi
  • 31
  • 1
2

I stumbled across this page while trying to solve a problem in NodeJS: reassembly of file chunks. Basically: I have an array of filenames. I need to append all those files, in the correct order, to create one large file. I must do this asynchronously.

Node's 'fs' module does provide appendFileSync but I didn't want to block the server during this operation. I wanted to use the fs.promises module and find a way to chain this stuff together. The examples on this page didn't quite work for me because I actually needed two operations: fsPromises.read() to read in the file chunk, and fsPromises.appendFile() to concat to the destination file. Maybe if I was better with JavaScript I could have made the previous answers work for me. ;-)

I stumbled across this and I was able to hack together a working solution:

/**
 * sequentially append a list of files into a specified destination file
 */
exports.append_files = function (destinationFile, arrayOfFilenames) {
    return arrayOfFilenames.reduce((previousPromise, currentFile) => {
        return previousPromise.then(() => {
            return fsPromises.readFile(currentFile).then(fileContents => {
                return fsPromises.appendFile(destinationFile, fileContents);
            });
        });
    }, Promise.resolve());
};

And here's a jasmine unit test for it:

const fsPromises = require('fs').promises;
const path = require('path');
const fsUtils = require( ... );
const TEMPDIR = 'temp';

describe("test append_files", function() {
    it('append_files should work', async function(done) {
        try {
            // setup: create some files
            await fsPromises.mkdir(TEMPDIR);
            await fsPromises.writeFile(path.join(TEMPDIR, '1'), 'one');
            await fsPromises.writeFile(path.join(TEMPDIR, '2'), 'two');
            await fsPromises.writeFile(path.join(TEMPDIR, '3'), 'three');
            await fsPromises.writeFile(path.join(TEMPDIR, '4'), 'four');
            await fsPromises.writeFile(path.join(TEMPDIR, '5'), 'five');

            const filenameArray = [];
            for (var i=1; i < 6; i++) {
                filenameArray.push(path.join(TEMPDIR, i.toString()));
            }

            const DESTFILE = path.join(TEMPDIR, 'final');
            await fsUtils.append_files(DESTFILE, filenameArray);

            // confirm "final" file exists    
            const fsStat = await fsPromises.stat(DESTFILE);
            expect(fsStat.isFile()).toBeTruthy();

            // confirm content of the "final" file
            const expectedContent = Buffer.from('onetwothreefourfive', 'utf8');
            var fileContents = await fsPromises.readFile(DESTFILE);
            expect(fileContents).toEqual(expectedContent);

            done();
        }
        catch (err) {
            fail(err);
        }
        finally {
        }
    });
});
Tomerikoo
  • 18,379
  • 16
  • 47
  • 61
Jay
  • 173
  • 1
  • 7
1

You can do it with a for loop.

An async function returns a promise:

async function createClient(client) {
    return await Client.create(client);
}

let clients = [client1, client2, client3];

If you write the following code, the clients are created in parallel:

const createdClientsArray = yield Promise.all(clients.map((client) =>
    createClient(client)
));

But if you want to create the clients sequentially, you should use a for loop:

const createdClientsArray = [];
for(let i = 0; i < clients.length; i++) {
    const createdClient = yield createClient(clients[i]);
    createdClientsArray.push(createdClient);
}
Tomerikoo
  • 18,379
  • 16
  • 47
  • 61
Deepak Sisodiya
  • 850
  • 9
  • 11
  • 9
    At this time, `async`/`await` is only available with a transpiler, or using [other engines](https://kangax.github.io/compat-table/es6/) than Node. Also, you really should not mix `async` with `yield`. Whle they act the same with a transpiler and [`co`](https://github.com/tj/co), they really are quite different and should not ordinarily substitude each other. Also, you should mention these restrictions as your answer is confusing to novice programmers. – Yanick Rochon Feb 25 '16 at 13:48
1

I've been using for...of to run promises sequentially. I'm not sure if it helps here, but this is what I've been doing.

async function run() {
    for (let val of arr) {
        const res = await someQuery(val)
        console.log(res)
    }
}

run().then().catch()
Nick Kotenberg
  • 914
  • 9
  • 8
0

Yes, you can chain an array of promise-returning functions as follows (this passes the result of each function to the next). You could of course edit it to pass the same argument (or no arguments) to each function.

function tester1(a) {
  return new Promise(function(done) {
    setTimeout(function() {
      done(a + 1);
    }, 1000);
  })
}

function tester2(a) {
  return new Promise(function(done) {
    setTimeout(function() {
      done(a * 5);
    }, 1000);
  })
}

function promise_chain(args, list, results) {

  return new Promise(function(done, errs) {
    var fn = list.shift();
    if (results === undefined) results = [];
    if (typeof fn === 'function') {
      fn(args).then(function(result) {
        results.push(result);
        console.log(result);
        promise_chain(result, list, results).then(done);
      }, errs);
    } else {
      done(results);
    }

  });
}

promise_chain(0, [tester1, tester2, tester1, tester2, tester2]).then(console.log.bind(console), console.error.bind(console));
Tomerikoo
  • 18,379
  • 16
  • 47
  • 61
cestmoi
  • 11
  • 1
-2

See this sample.

Promise.all runs the promises concurrently:

const { range, random, forEach, delay} = require("lodash");  
const run = id => {
    console.log(`Start Task ${id}`);
    let prom = new Promise((resolve, reject) => {
        delay(() => {
            console.log(`Finish Task ${id}`);
            resolve(id);
        }, random(2000, 15000));
    });
    return prom;
}


const exec = () => {
    let proms = []; 
    forEach(range(1,10), (id,index) => {
        proms.push(run(id));
    });
    let allPromises = Promise.all(proms); 
    allPromises.then(
        res => { 
            forEach(res, v => console.log(v));
        }
    );
}

exec();