
Possible Duplicate:
Coordinating parallel execution in node.js

First, some hands-on pseudo-code:

forEach(arrayelements) {
  asyncQueryFunction(function(qres) {
    //work with query results.
  });
}
// finally, AFTER all callbacks have returned:
res.render("myview");

How can I do that?

If that wasn't clear enough, let me explain:

I need to do a series of "update" queries (in MongoDB, via Mongoose), looping over a list of document ids. For each id in my array I will call an asynchronous function which will return query results (I don't actually need to do anything with them).

I know I have to use a JavaScript .forEach() loop, but how can I execute my "final" callback only when ALL of my asynchronous queries have finished?

I'm already using the excellent async library (https://github.com/caolan/async) for this kind of task when I've got a "limited" series of tasks to execute. But I don't think I can pass it an array of different functions.

CAN I?

Fabio B.
  • This is one method: http://stackoverflow.com/questions/4631774/coordinating-parallel-execution-in-node-js/4631909#4631909 – slebetman Sep 18 '12 at 07:22
  • you can use an async flow control library, "async" being the most popular (https://github.com/caolan/async) – Ege Özcan Sep 18 '12 at 07:35

2 Answers


A very simple pattern is to use a 'running tasks' counter:

var numRunningQueries = 0;
arrayelements.forEach(function(id) {
  ++numRunningQueries;
  asyncQueryFunction(id, function(qres) {
    // work with query results.
    --numRunningQueries;
    if (numRunningQueries === 0) {
      // finally, AFTER all callbacks have returned:
      res.render("myview");
    }
  });
});

Or, alternatively, use an async helper library such as Async.js.
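
A minimal sketch with async.each (named async.forEach in older releases), assuming asyncQueryFunction takes the id plus a completion callback; the exact names here are placeholders for your own code:

var async = require('async');

async.each(arrayelements, function(id, done) {
  asyncQueryFunction(id, function(qres) {
    // work with query results, then tell async this iteration is finished
    done();
  });
}, function(err) {
  // runs once, after every iteration has called done()
  // (err is only set if some iteration passed an error to done)
  res.render("myview");
});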

Andrey Sidorov
  • Let's say we have this block of code in some request. While the async function is processing, suppose we get one more request; in that case, won't var numRunningQueries be overwritten to 0? – bana Feb 18 '15 at 00:38
  • It's always incremented before the call and decremented at completion. JS in Node is single-threaded, so there can't be a race condition here. – Andrey Sidorov Feb 18 '15 at 00:42
  • But async functions run in a separate thread, right? So I was thinking: what happens when we are done with the array but the async function is still processing? Moreover, it's a local variable, so I think it'll retain its value. I am overthinking this, I guess. – bana Feb 18 '15 at 01:04
  • 1
    no, they don't. When you "run" multiple async functions in parallel you actually waiting for io in parallel and not "execute" in parallel. That's exactly the goal of event loop - and you only need one thread of execution for that. – Andrey Sidorov Feb 18 '15 at 09:25

If I understand correctly, asyncQueryFunction is always the same, as in you're applying the same update to each document.

I use a helper method to call back after saving (just swap save for update) multiple Mongoose documents (converted from CoffeeScript, so it may not be perfect):

function saveAll(docs, callback) {

  // a count for completed operations, and save all errors
  var count = 0
    , errors = [];

  if (docs.length === 0) {
    return callback();
  } else {
    for (var i = 0; i < docs.length; i++) {

      // instead of save, do an update, or asyncQueryFunction
      docs[i].save(function(err) {

        // increase the count in each individual callback
        count++;

        // save any errors
        if (err != null) {
          errors.push(err);
        }

        // once all the individual operations have completed,
        // callback, including any errors
        if (count === docs.length) {
          return callback(errors);
        }
      });
    }
  }
};

saveAll(arrayElements, function(errors) {
  // finally, AFTER all callbacks did return:
  res.render("myview");
});
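
Since the question is really about updating by id, here is a rough sketch of the same pattern with Model.update instead of save (MyModel and the $set update are hypothetical placeholders; swap in your own model and update):

function updateAll(ids, update, callback) {

  var count = 0
    , errors = [];

  if (ids.length === 0) {
    return callback(errors);
  }

  ids.forEach(function(id) {
    // hypothetical Mongoose model; use your own
    MyModel.update({ _id: id }, update, function(err) {
      count++;
      if (err != null) {
        errors.push(err);
      }
      // fire the final callback once every update has come back
      if (count === ids.length) {
        callback(errors);
      }
    });
  });
}

updateAll(arrayElements, { $set: { processed: true } }, function(errors) {
  // finally, AFTER all callbacks have returned:
  res.render("myview");
});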
zackdever