
I have a Django view function that handles the redemption process for a prize. I'm trying to eliminate a possible race condition when enforcing a redemption limit based on the count of a linked model (Redemption). At a high level my view function looks like this:

def redeem(request, prize_id):
    # Get a prize model that contains a limit attribute
    prize = Prize.objects.get(id=prize_id)

    # Check the count of a set of another model representing redemption objects 
    if prize.redemption_set.count() >= prize.redemption_limit:
        return error_page("Reached redemption limit")
    else:
        # Run some API calls that redeem the prize
        # Create a redemption object in the DB
        redemption = Redemption(prize=prize)
        redemption.save()

My main concern is concurrent requests: I can see extra redemptions slipping through if the count hasn't been updated by the time another request performs the check. I've looked at select_for_update() and atomic transactions, but I don't want errors to be raised while the row is locked. I'd rather the redeem calls queue up and be rejected cleanly once the limit is reached. I'll be running multiple web workers, and my database is MySQL.

Thanks for any tips!

degenTy

1 Answer


This is a great question.

You can use select_for_update(), which was introduced in Django 1.4 and takes a row-level lock in the database. All rows matched by the queryset are locked until the end of the enclosing transaction block, and other transactions are prevented from modifying them or acquiring locks on them. Crucially for your case, the default behavior is exactly the queueing you want: a competing transaction that tries to lock the same row simply blocks until the first one commits, then re-reads the fresh data. No error is raised unless you explicitly pass nowait=True (which makes the query raise a DatabaseError immediately instead of waiting). On MySQL this requires the InnoDB storage engine, since MyISAM does not support transactions or row locks.

See also this earlier answer on the same problem: https://stackoverflow.com/a/12532515/8874154

With the count check and the Redemption insert wrapped in the same transaction, you can ensure you never hand out more redemptions than the limit allows.

Swift