
I have about 500k records in a MongoDB collection, and I have to update every document with one flag.

My query goes like this:

db.table.updateMany(
  {},
  {
    $set: {
      my_custom_flag: 1
    }
  }
);

But there are a few documents that are right at the 16MB cap, and I'm not able to update those documents due to MongoDB's 16MB BSON document size limitation.

The above query throws an error and aborts the entire operation, i.e. I cannot update the rest of the documents.

Is there any way to suppress the errors and continue with the operation? Or can I get the `_id` of the documents that would exceed 16MB after the update?

The error thrown is:

{
   "message": "Resulting document after update is larger than 16777216",
   "name": "WriteError",
   "code": 17419,
   "index": 0,
   "errmsg": "Resulting document after update is larger than 16777216"
}
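
For anyone hitting the same wall: stored documents can never actually exceed 16MB, so the failing ones are just close enough to the cap that adding the field pushes the result over. A minimal sketch for finding their `_id`s, assuming MongoDB 4.4+ where the `$bsonSize` aggregation operator is available:

db.table.aggregate([
  // $bsonSize (MongoDB 4.4+) returns the current BSON size of the document in bytes
  { $project: { size: { $bsonSize: "$$ROOT" } } },
  // keep only documents within ~1KB of the 16MB (16777216 byte) cap
  { $match: { size: { $gt: 16 * 1024 * 1024 - 1024 } } }
]);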

EDIT: bulkWrite(), even when used with ordered: false, still throws the same error, and the other documents are not updated. I don't understand why this question is marked as a duplicate.
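
Roughly what I tried, as a minimal sketch (`id1` and `id2` are placeholders for real `_id` values; the real batch had one entry per document):

db.table.bulkWrite(
  [
    // one updateOne per document; the _id values below are placeholders
    { updateOne: { filter: { _id: id1 }, update: { $set: { my_custom_flag: 1 } } } },
    { updateOne: { filter: { _id: id2 }, update: { $set: { my_custom_flag: 1 } } } }
  ],
  { ordered: false } // continue past individual write failures
);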

P.S.: I do not wish to use GridFS.

  • Possible duplicate of [how to ignore duplicate documents when using insertMany in mongodb php library?](https://stackoverflow.com/questions/40811237/how-to-ignore-duplicate-documents-when-using-insertmany-in-mongodb-php-library) – Neil Lunn Jun 04 '18 at 11:49
  • [`insertMany()`](https://docs.mongodb.com/manual/reference/method/db.collection.insertMany/) and [`bulkWrite()`](https://docs.mongodb.com/manual/reference/method/db.collection.bulkWrite/) have an `"ordered": false` setting which allows the other documents in the "batch" to continue with the update even if some fail. This is common to all language APIs. In truth, "everything" actually uses the "Bulk API", where the setting actually lives. – Neil Lunn Jun 04 '18 at 11:51
  • I tried implementing this with bulkWrite(), but my operation still breaks at the 16MB error – praveen.menezes Jun 04 '18 at 13:02
  • Hi, any solution for this? I have a similar issue: trying to insert using insert_many with ordered set to False, but I still get a BSON doc too large error. Otherwise I might loop through single docs, since it is not a regular operation. – Darren Christopher Jan 22 '19 at 09:12
  • Hi Darren, I never found a solution to this. Instead I created a for loop to first get all the `_id`s of that particular collection, and then wrote an update command for each `_id`: `db.table.bulkWrite([ { updateOne: { filter: {_id: 1}, update: { $set: { my_custom_flag: 1 } } } }, { updateOne: {...} }, {...} ], { ordered: false });` Using `ordered: false` helped me update without breaking the loop (a fuller sketch follows this list). – praveen.menezes Feb 07 '19 at 06:00
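
A fuller sketch of the workaround from the last comment above (illustrative: the batch size of 1000 and the try/catch are assumptions, not part of the original comment):

// Collect all _ids first, then issue one unordered bulkWrite per batch.
// Catching the BulkWriteError means an oversized document only fails its
// own write; the loop carries on with the remaining batches.
var BATCH = 1000;
var ids = db.table.find({}, { _id: 1 }).toArray().map(function (d) { return d._id; });

for (var i = 0; i < ids.length; i += BATCH) {
  var ops = ids.slice(i, i + BATCH).map(function (id) {
    return { updateOne: { filter: { _id: id }, update: { $set: { my_custom_flag: 1 } } } };
  });
  try {
    db.table.bulkWrite(ops, { ordered: false });
  } catch (e) {
    print("batch starting at " + i + " had failures: " + e);
  }
}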
