
I'm using Node.js with Express to create a simple REST API. My workflow:

  1. I receive an object via a POST request
  2. I want to save it to the DB. If it is already in the DB, I want to increment a count of how many times we've tried to save it

My code for saving is as such:

    await collection.updateOne({
      // match all the properties
    }, {
      $inc: {
        encountered: 1
      }
    }, {
      upsert: true
    });

This, however, creates a race condition when I make a number of concurrent calls. For example, if I send the POST request 5 times at the same time, I can end up with two documents instead of one:

first document: encountered: 3
second document: encountered: 2

I assume the first request searched the DB, didn't find the item, and prepared an insert; meanwhile the second request got scheduled, also found no item, and also prepared an insert. The first insert then fired and was incremented several times, and the second insert fired and was incremented several times as well.

How do I deal with this? A global variable trying to mimic a singleton? I'm not sure that would be safe either. I don't mind worse performance, but I need the DB writes to be absolutely thread-safe/atomic, i.e. the search and the update must behave as a single atomic operation.

Edit: Alternatively, I'd be happy to block each incoming POST call until the previous call has fully resolved. I'm just not sure what the proper Node.js way to do that is.
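Regarding the "block until the previous call resolves" idea: in Node.js this is usually done not by blocking, but by chaining each write onto a single promise so they run one at a time. A minimal sketch, assuming a hypothetical `enqueue` helper and `saveOrIncrement` function (neither is an Express or MongoDB API):

```javascript
// Minimal sketch: serialize DB writes by chaining them onto one promise.
// `enqueue` and `saveOrIncrement` are illustrative names, not library APIs.
let queue = Promise.resolve();

function enqueue(task) {
  // Each task starts only after the previous one has settled
  // (whether it fulfilled or rejected).
  const run = queue.then(task, task);
  // Keep the chain alive even if this task rejects.
  queue = run.catch(() => {});
  return run;
}

// Usage inside a route handler (sketch):
// app.post('/items', (req, res) => {
//   enqueue(() => saveOrIncrement(req.body))
//     .then(() => res.sendStatus(200))
//     .catch(err => res.status(500).send(err.message));
// });
```

Note this only serializes writes within a single Node.js process; with multiple processes or servers you'd still need the database to enforce atomicity.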

Edit 2: I also tried db.collection.createIndex with a unique index, which makes the duplicate writes fail. I am afraid that upon re-trying such a failed write, I'd run into the same problem.
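For what it's worth, the unique index plus a retry is a common pattern here: two concurrent upserts that both find no matching document can both insert one, and the unique index turns the second insert into a duplicate-key error (code 11000). On retry the document now exists, so the upsert becomes a plain `$inc` update and succeeds. A sketch under those assumptions (the `upsertWithRetry` wrapper and the indexed field are illustrative, not from the question):

```javascript
// Sketch: retry an upsert when it fails with a duplicate-key error.
// Assumes a unique index on the matched properties, e.g.:
//   await collection.createIndex({ someKey: 1 }, { unique: true });

async function upsertWithRetry(collection, filter, retries = 3) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await collection.updateOne(
        filter,
        { $inc: { encountered: 1 } },
        { upsert: true }
      );
    } catch (err) {
      // 11000 = duplicate key: another request inserted the document
      // between this upsert's "not found" check and its insert.
      // On the retry the document exists, so $inc simply updates it.
      if (err.code !== 11000 || attempt === retries) throw err;
    }
  }
}
```

Unlike the in-process queue idea, this puts the atomicity in the database itself, so it also holds up across multiple Node.js processes.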

halpdoge
  • `upsert` should be atomic - that is the whole point of using it instead of a separate find and then insert. So, while the first query is operating, there's a momentary lock on that collection until the record is either updated or inserted. This is the database's job, not something you should do yourself by blocking requests; if you use the appropriate features, the database should handle it for you. I'm not an expert on mongodb concurrency at all, but I think the whole point of `$inc` and `upsert` is exactly to solve this concurrency issue inside the database so you don't have to. – jfriend00 Oct 30 '18 at 22:05
  • Are you just theorizing that you may have a concurrency problem? Or did you actually test and identify a concurrency issue? – jfriend00 Oct 30 '18 at 22:05
  • @jfriend00 I have tested everything locally and it works fine, but as soon as I put it in production, where I can get even 20-30 requests firing at once, I saw multiple identical items being saved in the DB => I came to the conclusion this is a concurrency problem, which I expected coming from Java background. But since there's no 'synchronized' in NodeJS, I'm not sure how to approach this. – halpdoge Oct 30 '18 at 22:09
  • OK, then look for places where you aren't using the built-in concurrency features in mongodb correctly rather than trying to build your own concurrency control on top of the DB. You should be letting the database manage this for you. – jfriend00 Oct 30 '18 at 22:11

0 Answers