
Finally, async/await will soon be supported in all major browsers except IE. So we can now start writing more readable code with async/await, but there is a catch. A lot of people use async/await like this:

const userResponse = await fetchUserAsync();
const postsResponse = await fetchPostsAsync();

While this code is readable, it has a problem: it runs the functions in series. It won't start fetching posts until fetching the user has finished. The solution is simple: we need to fetch the resources in parallel.
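As a sketch of that fix (the two fetchers here are hypothetical stand-ins for the real `fetchUserAsync`/`fetchPostsAsync`, simulated with timers), starting both requests before awaiting either lets them run concurrently:

```javascript
// Hypothetical stand-ins for the question's fetchers;
// each resolves after a short delay to simulate network latency.
const fetchUserAsync = () =>
  new Promise(resolve => setTimeout(() => resolve('user'), 100));
const fetchPostsAsync = () =>
  new Promise(resolve => setTimeout(() => resolve('posts'), 100));

async function loadInParallel() {
  // Start both requests first, without awaiting...
  const userPromise = fetchUserAsync();
  const postsPromise = fetchPostsAsync();
  // ...then await them, so both are in flight at the same time.
  const userResponse = await userPromise;
  const postsResponse = await postsPromise;
  return { userResponse, postsResponse };
}

loadInParallel().then(result => console.log(result));
```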

So what I want to do is (in pseudo language):

fn task() {
  result-1 = doAsync();
  result-2 = doAsync();
  result-n = doLongAsync();

  // handle results together
  combinedResult = handleResults(result-1, result-2);

  lastResult = handleLastResult(result-n);
}
NoNameProvided

7 Answers


You can write something like this:

const responses = await Promise.all([
  fetchUserAsync(),
  fetchPostsAsync(),
]);

const userResponse = responses[0];
const postsResponse = responses[1];

This is easy, right? But there is a catch: Promise.all has fail-fast behaviour, which means it will reject as soon as one of the promises rejects. You probably want a more robust solution where we are in charge of handling the rejection of any of the fetches. Luckily, this can be achieved with async/await alone, without using Promise.all. A working example:

console.clear();

function wait(ms, data) {
  return new Promise( resolve => setTimeout(resolve.bind(this, data), ms) );
}

/** 
 * This will run in series, because 
 * we call a function and immediately wait for its result, 
 * so this will finish in 1s.
 */
async function series() {
  return {
    result1: await wait(500, 'seriesTask1'),
    result2: await wait(500, 'seriesTask2'),
  }
}

/** 
 * While here we call the functions first,
 * then wait for the result later, so 
 * this will finish in 500ms.
 */
async function parallel() {
  const task1 = wait(500, 'parallelTask1');
  const task2 = wait(500, 'parallelTask2');

  return {
    result1: await task1,
    result2: await task2,
  }
}

async function taskRunner(fn, label) {
  const startTime = performance.now();
  console.log(`Task ${label} starting...`);
  let result = await fn();
  console.log(`Task ${label} finished in ${ Number.parseInt(performance.now() - startTime) } milliseconds with,`, result);
}

void taskRunner(series, 'series');
void taskRunner(parallel, 'parallel');


/* 
 * The result will be:
 * Task series starting...
 * Task parallel starting...
 * Task parallel finished in 500 milliseconds with, { "result1": "parallelTask1", "result2": "parallelTask2" }
 * Task series finished in 1001 milliseconds with, { "result1": "seriesTask1", "result2": "seriesTask2" }
 */

Note: You will need a browser with async/await enabled to run this snippet (or Node.js 7.6+)

This way you can simply use try/catch to handle your errors and return partial results inside the parallel function.
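Since this answer was written, `Promise.allSettled` (ES2020) has become available; it waits for every promise regardless of rejections, so no fail-fast behaviour. A minimal sketch, where `ok` and `fail` are invented placeholder tasks rather than anything from the question:

```javascript
// Placeholder tasks: one resolves, one rejects.
const ok = () => Promise.resolve('user data');
const fail = () => Promise.reject(new Error('posts failed'));

async function loadAll() {
  // allSettled never rejects; each entry reports its own outcome
  // as { status: 'fulfilled', value } or { status: 'rejected', reason }.
  const [user, posts] = await Promise.allSettled([ok(), fail()]);
  return {
    user: user.status === 'fulfilled' ? user.value : null,
    posts: posts.status === 'fulfilled' ? posts.value : posts.reason.message,
  };
}

loadAll().then(result => console.log(result));
```

This gives you partial results without having to catch each rejection yourself.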

NoNameProvided
  • I have some questions about your code. How can you "compose" an await parallel pool? I mean, in your example you know you have two tasks to execute and return results. How can I compose using a for loop (let's say I don't know the task count at the moment of writing the script)? My use case: I retrieve some ids from an HTTP call, then for each id I have a task to run. How do I run all the tasks for the retrieved ids in parallel? – BlackHoleGalaxy Nov 24 '17 at 18:02
  • Thanks, good question! I will update my answer when I am in front of my Mac. – NoNameProvided Nov 24 '17 at 18:04
  • Nice :) I think it could be a common pattern. For example here is the snippet I would like to execute as a parallel pool and retrieve one and only result. `for (let i = 0; i < serviceList.length; i++) { let result = await this.doSomething(stackServicesList[i].id); };` I tried to create a `let result = []` then push some await inside of it with no success. – BlackHoleGalaxy Nov 24 '17 at 18:12
  • Note, to run this example in Node.js you'll need to require the (currently experimental) [Performance Timing API](https://nodejs.org/api/perf_hooks.html#perf_hooks_performance_timing_api). E.g. add this to the top of the example: `const { performance } = require('perf_hooks');` – Molomby Dec 20 '17 at 03:56
  • Both `series` and `parallel` have fail-fast behaviour too; the answer text is misleading. – Mulan Jan 21 '19 at 02:38
  • This is a bad practice. Never use multiple awaits for two or more parallel async tasks, because you will not be able to handle errors seriously. It works for the positive scenario only; in the negative scenario (a task rejecting) you will always end up with unhandled errors even though you use try/catch. You must always use Promise.all for parallel async tasks. See my answer here: https://stackoverflow.com/a/54291660/3826175 If you need to handle errors globally and separately, use this: `try { let [val1, val2] = await Promise.all([ task1().catch(e => ...), task2().catch(e => ...) ]); } catch(e) { }` – mikep Jan 28 '19 at 14:27
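The pattern mikep's comment sketches, attaching a `.catch` to each task so `Promise.all` never sees a rejection, can be made runnable like this (the `task1`/`task2` stubs are invented for illustration):

```javascript
// Placeholder tasks; task2 rejects to demonstrate per-task handling.
const task1 = () => Promise.resolve('task1 ok');
const task2 = () => Promise.reject(new Error('task2 failed'));

async function run() {
  // Each .catch converts a rejection into a regular value,
  // so Promise.all resolves even when individual tasks fail.
  const [val1, val2] = await Promise.all([
    task1().catch(e => `handled: ${e.message}`),
    task2().catch(e => `handled: ${e.message}`),
  ]);
  return [val1, val2];
}

run().then(vals => console.log(vals));
```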

If you're ok with the fail-fast behavior of Promise.all and the destructuring assignment syntax:

const [userResponse, postsResponse] = await Promise.all([
  fetchUserAsync(),
  fetchPostsAsync(),
]);
ricka
  • Chrome gives me `?page=home:36 Uncaught SyntaxError: await is only valid in async function` when trying this method.. But maybe wrong usecase – Angry 84 Oct 14 '18 at 05:40
  • @Mayhem you need to wrap that code in an async function – ricka Oct 15 '18 at 18:07
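For instance (an editor's sketch, stubbing the question's fetchers with resolved promises), wrapping the snippet in an immediately-invoked async function makes `await` legal at the top level of a script:

```javascript
// Stubs standing in for the question's fetchers.
const fetchUserAsync = () => Promise.resolve('user');
const fetchPostsAsync = () => Promise.resolve('posts');

// An async IIFE provides the async context that `await` requires.
(async () => {
  const [userResponse, postsResponse] = await Promise.all([
    fetchUserAsync(),
    fetchPostsAsync(),
  ]);
  console.log(userResponse, postsResponse);
})();
```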

For those asking how to extend this to a run-time-determined number of calls, you can use two loops. The first starts all the tasks; the second waits for everything to finish.

console.clear();

function wait(ms, data) {
  return new Promise( resolve => setTimeout(resolve.bind(this, data), ms) );
}

/** 
 * We start every task first, then await them later,
 * so the total time is roughly that of the longest
 * task (500ms here), not the sum of all of them.
 */
async function runTasks(timings) {
  let tasks = [];
  for (let i in timings) {
      tasks.push(wait(timings[i], `Result of task ${i}`));
  }

  /* Want fail-fast? Use Promise.all */
  //return Promise.all(tasks);
  
  let results = [];
  for (let task of tasks) {
       results.push(await task);
  }

  return results;
}

async function taskRunner(fn, arg, label) {
  const startTime = performance.now();
  console.log(`Task ${label} starting...`);
  let result = await fn(arg);
  console.log(`Task ${label} finished in ${ Number.parseInt(performance.now() - startTime) } milliseconds with,`, result);
}

void taskRunner(runTasks, [50,100,200,60,500], 'Task List');
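When fail-fast behaviour is acceptable, the two loops above can also collapse into a single `map` plus `Promise.all`; a sketch reusing the same `wait` helper (the `runTasksWithMap` name is invented here):

```javascript
function wait(ms, data) {
  return new Promise(resolve => setTimeout(() => resolve(data), ms));
}

async function runTasksWithMap(timings) {
  // map starts every task immediately; Promise.all waits for all of them
  // and preserves the input order in the results.
  return Promise.all(
    timings.map((ms, i) => wait(ms, `Result of task ${i}`))
  );
}

runTasksWithMap([50, 100, 200]).then(results => console.log(results));
```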
Wilco

I actually just did this same thing. By using promises and then Promise.all to synchronize them at the end, you can make many concurrent requests and still be sure you have all the results back before you finish.

See here in the last example: http://javascriptrambling.blogspot.com/2017/04/to-promised-land-with-asyncawait-and.html

Kevin Williams
  • As I said, Promise.all has fail-fast behavior, so you are not achieving the same goal. – NoNameProvided Apr 04 '17 at 16:30
  • Ah, didn't realize you had asked your own question so you could answer it. But in the original question you didn't mention that you wanted to handle every error. Usually it is sufficient to handle just one error as it is rare for a process to be successful if even one part fails. Unless it is needed, less code is better. – Kevin Williams Apr 04 '17 at 16:34

The pseudo code can be written as below:

async fn task() {
  result-1 = doAsync();
  result-2 = doAsync();
  result-n = doLongAsync();

  try {
    // handle results together
    combinedResult = handleResults(await result-1, await result-2);
    lastResult = handleLastResult(await result-n);
  } catch (err) {
    console.error(err);
  }
}

result-1, result-2 and result-n all start running in parallel. combinedResult, i.e. the return value of handleResults, is computed as soon as result-1 and result-2 are available, and lastResult, i.e. the return value of handleLastResult, as soon as result-n is available.
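As an editor's sketch (the `doAsync`, `doLongAsync`, `handleResults` and `handleLastResult` bodies below are invented stand-ins), the pseudocode translates to runnable JavaScript roughly as:

```javascript
// Stand-ins for the pseudocode's async functions and handlers.
const doAsync = () => Promise.resolve(1);
const doLongAsync = () =>
  new Promise(resolve => setTimeout(() => resolve(10), 50));
const handleResults = (a, b) => a + b;
const handleLastResult = n => n * 2;

async function task() {
  // All three calls start immediately; nothing is awaited yet.
  const result1 = doAsync();
  const result2 = doAsync();
  const resultN = doLongAsync();
  try {
    // Awaiting here only waits for completion; the work overlapped.
    const combinedResult = handleResults(await result1, await result2);
    const lastResult = handleLastResult(await resultN);
    return { combinedResult, lastResult };
  } catch (err) {
    console.error(err);
  }
}

task().then(r => console.log(r));
```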

Hope this helps

Nagaraja Malla
  • you may refer to this link for further understanding: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function – Nagaraja Malla Nov 20 '17 at 06:26

First, is your code blocking code?

If yes, remember that JavaScript is single-threaded, so you cannot run two pieces of synchronous code (for example, two for or while loops) at the same time.

But it is possible to achieve that using Web Workers. I managed to execute functions in generic web workers, without using separate js files.

setInterval(()=>{console.log("non blocked " + Math.random())}, 900)

console.log("start blocking code in parallel in web Worker")
console.time("blocked")

genericWorker(window, ["blockCpu", function (block){    
    block(10000) //This blockCpu function is defined below
    return "\n\nbla bla\n" //This is caught in the resolved promise

}]).then(function (result){
    console.timeEnd("blocked")
    console.log("End of blocking code", result)
})
.catch(function(error) { console.log(error) })


/*  A Web Worker that does not use a file; it creates one from a Blob.
    @cb_context, the context where the callback's function arguments live, e.g. window
    @cb, ["fn_name1", "fn_name2", function (fn1, fn2) {}]
        The callback will be executed, and you can pass other functions to that cb
*/
function genericWorker(cb_context, cb) {
    return new Promise(function (resolve, reject) {

        if (!cb || !Array.isArray(cb))
            return reject("Invalid data")

        var callback = cb.pop()
        var functions = cb

        if (typeof callback != "function" || functions.some((fn)=>{return typeof cb_context[fn] != "function"}))
            return reject(`The callback or some of the parameters: (${functions.toString()}) are not functions`)

        if (functions.length>0 && !cb_context)
            return reject("context is undefined")

        callback = fn_string(callback) //Callback to be executed
        functions = functions.map((fn_name)=> { return fn_string( cb_context[fn_name] ) })

        var worker_file = window.URL.createObjectURL( new Blob(["self.addEventListener('message', function(e) { var bb = {}; var args = []; for (fn of e.data.functions) { bb[fn.name] = new Function(fn.args, fn.body); args.push(fn.name)}; var callback = new Function( e.data.callback.args, e.data.callback.body); args = args.map(function(fn_name) { return bb[fn_name] });  var result = callback.apply(null, args) ;self.postMessage( result );}, false)"]) )
        var worker = new Worker(worker_file)

        worker.postMessage({ callback: callback, functions: functions })

        worker.addEventListener('error', function(error){ return reject(error.message) })

        worker.addEventListener('message', function(e) {
            resolve(e.data), worker.terminate()
        }, false)

        //From function to string, with its name, arguments and its body
        function fn_string (fn) {
            var name = fn.name, fn = fn.toString()

            return { name: name, 
                args: fn.substring(fn.indexOf("(") + 1, fn.indexOf(")")),
                body: fn.substring(fn.indexOf("{") + 1, fn.lastIndexOf("}"))
            }
        }
    })
}

//random blocking function
function blockCpu(ms) {
    var now = new Date().getTime(), result = 0
    while(true) {
        result += Math.random() * Math.random();
        if (new Date().getTime() > now +ms)
            return;
    }   
}
Fernando Carvajal
  • Upvoted. I appreciated your first sentence note regarding the fact any "theory" wears off if the "async" function in fact wraps behind a synchronous code (cpu-bound calculations, strict loops, etc.). That's true also in nodejs; in this case now in 2021 you can use worker threads instead of webWorkers. Thx. – Giorgio Robino Apr 27 '21 at 08:21

The selected answer proposes two ways, both waiting for the termination of all the "spawned" async functions.

My proposal is instead to spawn each async function using setImmediate (the Node.js ~equivalent of setTimeout(..., 0)), so that each function runs and you can get intermediate results before the completion of all the functions:

for (let i = 0; i < numSpawns; i++ ) {

  // nodejs
  setImmediate( async () => { console.log( await runAsyncFunction(msecsMax) ) } )

  // browser: substitute setImmediate with
  // setTimeout(async () => { console.log(await runAsyncFunction(msecsMax)) }, 0)

}  

Complete demo code

/**
 * parallel.js
 * demo, to "spawn" in parallel multiple async functions
 */

/**
 * sleep
 * wrap setTimeout, returning a value
 *
 * @async
 * @param {Number}  msecs number of milliseconds
 * @return {Number} msecs
 */
function sleep(msecs) {
  return new Promise(function(resolve /*, reject*/) {
     setTimeout( () => { resolve(msecs) }, msecs )
   })
}

/**
 * randomInteger
 * Returns a random integer between min (inclusive) and max (inclusive)
 * @param {Number}  min
 * @param {Number}  max
 * @return {Number} random integer
 */
function randomInteger(min, max) {
  return Math.floor(Math.random() * (max - min + 1)) + min;
}


/**
 * runAsyncFunction
 * simulate an async function, 
 * returning, after a random number of msecs, the number of msecs
 *
 * @async
 * @param {Number}  msecsMax max duration in milliseconds
 * @return {Number} random number of msecs
 */
async function runAsyncFunction(msecsMax) {
  const msecsMin = 500
  return await sleep( randomInteger(msecsMin, msecsMax) )
}



async function parallel(numSpawns, msecsMax) {
  for (let i = 0; i < numSpawns; i++ ) {

    // nodejs
    setImmediate( async () => { console.log( await runAsyncFunction(msecsMax) ) } )

    // browser: substitute setImmediate with
    // setTimeout(async () => { console.log(await runAsyncFunction(msecsMax)) }, 0)
  
  }  
}  


async function main() {

  const msecsMax = 3000
  const numSpawns = 10
  
  // runs in "parallel" 10 async functions, 
  // each one returning after a sleep of a random number of milliseconds (between 500 to 3000)  
  parallel(numSpawns, msecsMax)
}

main()

Run the program:

$ /usr/bin/time --verbose node parallel
1204
1869
1983
2042
2119
2220
2222
2611
2642
2660
    Command being timed: "node parallel"
    User time (seconds): 0.07
    System time (seconds): 0.00
    Percent of CPU this job got: 3%
    Elapsed (wall clock) time (h:mm:ss or m:ss): 0:02.72
    Average shared text size (kbytes): 0
    Average unshared data size (kbytes): 0
    Average stack size (kbytes): 0
    Average total size (kbytes): 0
    Maximum resident set size (kbytes): 31568
    Average resident set size (kbytes): 0
    Major (requiring I/O) page faults: 0
    Minor (reclaiming a frame) page faults: 2160
    Voluntary context switches: 39
    Involuntary context switches: 1
    Swaps: 0
    File system inputs: 0
    File system outputs: 0
    Socket messages sent: 0
    Socket messages received: 0
    Signals delivered: 0
    Page size (bytes): 4096
    Exit status: 0
Giorgio Robino