
I'm using Node.js with MongoDB and Express. I need to insert records into a collection where the email field is mandatory. I'm using the insertMany function to insert records. It works fine when unique emails are inserted, but when duplicate emails are entered, the operation breaks abruptly.

I tried using try/catch to print the error message, but execution fails as soon as a duplicate email is inserted. I want the execution to continue and store the duplicates, and I want to get the final list of the records that were inserted/failed.

Error Message:

Unhandled rejection MongoError: E11000 duplicate key error collection: testingdb.gamers index: email_1 dup key: 

Is there any way to handle the errors or is there any other approach apart from insertMany?

Update:

Email is a unique field in my collection.

Anirudh
    In the question you mentioned that _I want the execution to continue and store the duplicates_. So do you want to store duplicates or not? – alexmac Oct 08 '17 at 11:49
  • I used [Async](https://www.npmjs.com/package/async) module to solve my problem. – Anirudh Oct 12 '17 at 12:38

3 Answers


If you want to continue inserting the remaining documents rather than stopping on the first error, consider setting the {ordered: false} option on insertMany(), e.g.

db.collection.insertMany( [ , , ... ], { ordered: false } )

According to the docs, an unordered operation will continue to process the remaining write operations in the queue, while still reporting the failures in a BulkWriteError.
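With the Node.js driver, the error thrown by an unordered insertMany still identifies which documents were rejected, so you can reconstruct the inserted/failed lists the question asks for. A minimal sketch, assuming the error exposes a `writeErrors` array with per-document `index` fields as shown in the MongoDB docs (verify against your driver version); `summarizeBulkWrite` and `insertAll` are my own helper names:

```javascript
// Sketch: partition the original batch into inserted vs. failed documents
// using the indexes reported in a BulkWriteError-shaped object.
function summarizeBulkWrite(docs, err) {
  const failedIndexes = new Set(err.writeErrors.map(we => we.index));
  return {
    inserted: docs.filter((_, i) => !failedIndexes.has(i)),
    failed: docs.filter((_, i) => failedIndexes.has(i)),
  };
}

async function insertAll(collection, docs) {
  try {
    // With ordered:false, non-duplicate docs are still written even if
    // some inserts violate the unique index.
    await collection.insertMany(docs, { ordered: false });
    return { inserted: docs, failed: [] };
  } catch (err) {
    // The error carries one writeErrors entry per rejected document.
    return summarizeBulkWrite(docs, err);
  }
}
```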

Nic Cottrell

I can't comment yet, so posting this as an answer: is your database collection using a unique index for this field, or does your schema have a unique attribute on the field? Please share more information about your code.

From MongoDb docs:

"Inserting a duplicate value for any key that is part of a unique index, such as _id, throws an exception. The following attempts to insert a document with a _id value that already exists:"

try {
   db.products.insertMany( [
      { _id: 13, item: "envelopes", qty: 60 },
      { _id: 13, item: "stamps", qty: 110 },
      { _id: 14, item: "packing tape", qty: 38 }
   ] );
} catch (e) {
   print (e);
}

Since _id: 13 already exists, the following exception is thrown:

BulkWriteError({
   "writeErrors" : [
      {
         "index" : 0,
         "code" : 11000,
         "errmsg" : "E11000 duplicate key error collection: restaurant.test index: _id_ dup key: { : 13.0 }",
         "op" : {
            "_id" : 13,
            "item" : "envelopes",
            "qty" : 60
         }
      }
   ], 
(some code omitted)

Hope it helps.

dpetrini

Since you know that the error occurs due to duplicate key insertions, you can separate the initial array of objects into two parts: one with unique keys and one with the duplicates. That way you have a list of duplicates you can handle separately and a list of unique records to insert.

let a = [
  {'email': 'dude@gmail.com', 'dude': 4},
  {'email': 'dude@yahoo.com', 'dude': 2}, 
  {'email': 'dude@hotmail.com', 'dude': 2}, 
  {'email': 'dude@gmail.com', 'dude': 1}
];

let i = a.reduce((acc, item) => {
  // first occurrence of an email goes to original, later ones to duplicates
  if (acc.original.some(o => o.email === item.email)) {
    acc.duplicates.push(item);
  } else {
    acc.original.push(item);
  }
  return acc;
}, { original: [], duplicates: [] });

console.log(i);

EDIT: I just realised that this won't work if the keys are already present in the DB, so you should probably not use this answer. But I'll just leave it here as a reference for someone else who may think along the same lines. Nic Cottrell's answer is right.
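To also catch duplicates already stored in the DB, the same partitioning idea can be seeded with the existing emails fetched up front (e.g. via collection.distinct('email'), which is my assumption about how you'd obtain them). A sketch, with `partition` as a hypothetical helper; `existingEmails` is shown as a plain array for illustration:

```javascript
// Sketch: partition new docs against both in-batch duplicates and
// emails already present in the collection.
// existingEmails would come from the DB, e.g. await collection.distinct('email').
function partition(docs, existingEmails) {
  const seen = new Set(existingEmails);
  const original = [];
  const duplicates = [];
  for (const doc of docs) {
    if (seen.has(doc.email)) {
      duplicates.push(doc); // already in the DB or earlier in this batch
    } else {
      seen.add(doc.email);
      original.push(doc);
    }
  }
  return { original, duplicates };
}
```

Note that this check can race with concurrent writers, so the unique index remains the real safety net.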

TheChetan