
Currently I am facing callback hell while trying to insert a number of entries into MongoDB with the Node.js driver. There are certain tasks that should be performed only after all the records (in an array) are inserted. I checked out the async.each method, but the main issue is that it invokes the final callback as soon as an error is encountered, so the rest of the items in the collection won't be processed.

So what is the correct way to deal with this kind of problem with async.js?

PS: An example would be very helpful, as I am just starting out with async.js and there are very few (if any) examples of using it for batch inserts into MongoDB.
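To make the short-circuit behavior concrete: if the iteratee passes an error to its callback, async.each stops and invokes the final callback immediately. The usual workaround is to record the failure yourself and call the callback with no error. Below is a minimal, self-contained sketch; `each` is a simplified inline stand-in for async.each (real code would `require('async')`), and the failing item #3 is a made-up example:

```javascript
// Simplified stand-in for async.each semantics (assumption: real code
// would use require('async').each; this keeps the sketch self-contained).
function each(items, iteratee, done) {
  let pending = items.length;
  let failed = false;
  items.forEach(function (item) {
    iteratee(item, function (err) {
      if (failed) return;
      if (err) { failed = true; return done(err); } // short-circuits here
      if (--pending === 0) done(null);
    });
  });
}

const errors = [];
each([1, 2, 3, 4], function (n, next) {
  // simulate a failing insert for item 3, but swallow the error
  const err = (n === 3) ? new Error('insert failed for ' + n) : null;
  if (err) errors.push({ item: n, error: err.message });
  next(null); // always pass null so the remaining items are processed
}, function () {
  console.log('processed all items; failures: ' + errors.length);
});
```

Because every iteratee reports success, the final callback always runs, and `errors` tells you afterwards which items failed.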

rahulserver
  • Async is a terrible/obsolete approach. You should use promises instead. And the best way to insert multiple records of the same type is via a single multi-row insert. Here's an example for PostgreSQL: https://stackoverflow.com/questions/37300997/multi-row-insert-with-pg-promise. You need to find the same for mongodb. – vitaly-t Jun 26 '17 at 14:45
  • @vitaly-t I think that's just an opinion. Googling for callback hell yields asyncjs as a very popular solution. Currently I would prefer going with async.js – rahulserver Jun 26 '17 at 14:47
  • @rahulserver so if you have 10 inserts to do and at iteration #6 you get an error, what should happen? Remove the previous elements (#1 to #5)? Continue to the next ones (#7 to #10)? Or just stop there (#6 to #10 are not inserted) but do not run the code in the final function? – Daniele Tassone Jun 27 '17 at 07:29
  • @DanieleTassone preferably record in an array that #6 failed and move on to the next one – rahulserver Jun 27 '17 at 10:25
  • I usually use Promises to avoid a "crashing" scenario that causes the application to behave abnormally. My suggestion is to use a Promise-based approach or try/catch inside the async function (personally I don't like try/catch). For MongoDB used inside async I usually batch all the inserts/updates into an OrderedBulk, then commit at the end of the async flow. I don't know if this is the type of reply you want; if so, I can post a complete reply. – Daniele Tassone Jun 27 '17 at 18:54

1 Answer


Let's describe the problem.

  1. You have an iteration-based flow with some library (e.g. AsyncJS).
  2. The async flow can be interrupted at some iteration.
  3. You want to be sure all the operations in MongoDB are correctly stored, or perhaps decide later whether each operation should happen at all.

In this case I suggest using a Bulk Operation:

 // array of operations
 let batch = yourCollection.initializeUnorderedBulkOp();

 async.each(yourArrayToIterate, function (item, next) {
     // queue some operations; they are executed later, by batch.execute()
     batch.insert({firstName: "Daniele"});
     batch.find({firstName: "Daniele"}).updateOne({$set: {lastName: "Tassone"}});

     // other code to run... some old callback-style library
     try {
         someOldLibrary.doSomethingThatCanCreateAnException();
     } catch (e) {
         // some error happened; record it if needed, but keep going
     }

     // more modern approach with a Promise-based library
     someNewLibrary.doSomething()
         .then(function (result) {
             next(); // report success so the remaining items are processed
         })
         .catch(function (err) {
             // some error happened; pass nothing so iteration continues
             next();
         });

 }, function (err) {
     // flow is now complete, take a decision
     if (err) {
         // do nothing, maybe?
     } else {
         // execute all the queued bulk operations in one round trip
         batch.execute().then(function (executionResult) {
             // done
         });
     }
 });

Some notes: 1) a BulkOperation can be Ordered or Unordered. Unordered performs better and keeps working even if some errors happen. 2) Ordered executes sequentially, and if an error happens MongoDB will not continue to the next element in the bulk.

Documentation: http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#initializeOrderedBulkOp
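For what it's worth, newer versions of the Node.js driver also expose collection.bulkWrite, which takes the whole operation list in one call; passing { ordered: false } gives the same keep-going-past-failures behavior as an unordered bulk. A minimal sketch (assuming `yourCollection` is an already-connected MongoDB collection; `runBatch` is just an illustrative wrapper):

```javascript
// Hedged sketch: build the whole operation list up front, then hand it to
// bulkWrite in a single call. { ordered: false } lets the server continue
// past individual failures instead of stopping at the first one.
const ops = [
  { insertOne: { document: { firstName: 'Daniele' } } },
  { updateOne: { filter: { firstName: 'Daniele' },
                 update: { $set: { lastName: 'Tassone' } } } }
];

// `runBatch` is a hypothetical helper name; it returns the driver's Promise.
function runBatch(yourCollection) {
  return yourCollection.bulkWrite(ops, { ordered: false })
    .then(function (result) {
      // result.insertedCount / result.modifiedCount summarize the batch
      return result;
    });
}
```

This avoids mixing per-item callbacks with the batch entirely: the iteration only builds `ops`, and a single Promise settles once the whole batch is done.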

Daniele Tassone