
I have a service method where I request an entity by ID from the database. If the entity has the attribute paid == false, I set it to true and do something. If paid==true it just returns.

@Override
@Transactional(rollbackFor = {ServiceException.class})
public void handleIntentSucceeded(PaymentIntent intent) throws ServiceException {
   LOGGER.trace("handleIntentSucceeded({})", intent);

   CreditCharge charge = transactionRepository.findByPaymentIntentId(intent.getId());

   if (charge.getPaid()) {
      return;
   }

   // do some stuff

   charge.setPaid(true);
   transactionRepository.save(charge);
}

Now if there are multiple requests with the same intent at the same time, this method is no longer consistent: for example, the first request reads the charge with paid == false and starts doing "some stuff"; if a second request reaches this method before the first one has saved the charge with paid == true, it will also read paid == false and do "some stuff" again, even though the first request is already doing it. Is this conclusion correct?

To make sure that only one request can process this method at a time, and so avoid "some stuff" being done multiple times, I could declare it as @Transactional(isolation = Isolation.SERIALIZABLE). That way a request can only run this method/transaction after the previous transaction has committed.
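
For illustration, the change I have in mind would only touch the annotation (a sketch; the method body stays exactly as above):

// requires org.springframework.transaction.annotation.Isolation
@Override
@Transactional(isolation = Isolation.SERIALIZABLE, rollbackFor = {ServiceException.class})
public void handleIntentSucceeded(PaymentIntent intent) throws ServiceException {
   // ... unchanged body from above ...
}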

Is this the best approach or is there a better way?

Jozott
  • don't ask for best approaches - it is opinion-based and doesn't bring you anywhere. Make it work, see if it needs improvement, then improve it - that is how you learn. Also, transactions are not really about thread safety, even though they prevent dirty reads and such - take a look here https://stackoverflow.com/questions/6477574/do-database-transactions-prevent-race-conditions#:~:text=TL%2FDR%3A%20Transactions%20do%20not%20inherently%20prevent%20all%20race%20conditions.&text=Transactions%20are%20not%20a%20secret,safe%20from%20all%20concurrency%20effects. – J Asgarov May 11 '21 at 08:52
  • if you want better answers you should have asked "are there any better approaches than this one?". And refer to this question for a better understanding of the use cases of @Transactional https://stackoverflow.com/questions/1079114/where-does-the-transactional-annotation-belong?rq=1 – Dhyey Vachhani May 11 '21 at 08:53
  • 2
    Define 'consistent'. We don't know what your business requirements are. If you mean that another transaction should not proceed if it knew `Charge.paid` had been updated in the meantime, then you don't need `SERIALIZABLE`, `REPEATABLE_READ` is enough. If you mean that one transaction starting should prevent all the others from doing the same, then you need to lock your entity for update. – crizzis May 11 '21 at 19:17
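
The "lock your entity for update" suggestion from the last comment could look roughly like this with Spring Data JPA (a sketch; the repository and entity names come from the question, the ID and parameter types are assumptions):

import javax.persistence.LockModeType;

import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Lock;

public interface TransactionRepository extends JpaRepository<CreditCharge, Long> {

   // Takes a pessimistic row lock (SELECT ... FOR UPDATE) while the surrounding
   // transaction is open, so concurrent callers block until the first one commits.
   @Lock(LockModeType.PESSIMISTIC_WRITE)
   CreditCharge findByPaymentIntentId(String paymentIntentId);
}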

2 Answers


I would probably look for a way of optimistically locking the record (e.g. using some kind of update counter), so that only the first concurrent transaction changing the paid property completes successfully.
Any subsequent transaction that tried to modify the same entity in the meantime would then fail, and the actions it performed during "do some stuff" would be rolled back.

Optimistic vs. Pessimistic locking
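
A minimal sketch of this idea with JPA's built-in optimistic locking, assuming the CreditCharge entity from the question (all fields except paid are assumptions):

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.Version;

@Entity
public class CreditCharge {

   @Id
   @GeneratedValue
   private Long id;

   private boolean paid;

   // JPA increments this column on every update; a concurrent transaction that
   // read the old version fails at commit with an optimistic locking exception.
   @Version
   private long version;

   public boolean getPaid() { return paid; }

   public void setPaid(boolean paid) { this.paid = paid; }
}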

edit: the REPEATABLE_READ isolation level (as suggested in one of the comments) might also behave similarly to optimistic locking, though this may depend on the implementation

fladdimir

One solution, as already mentioned above, is optimistic locking. However, an optimistic locking failure will lead to a failed HTTP request. If this is a problem, you can handle the exception.
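
For example, the caller could treat the failure as "another request already handled this intent" and swallow it (a sketch; it has to sit outside the transactional method, because the failure is only raised at commit, and the onPaymentIntentSucceeded / paymentService names are assumptions, with Spring assumed to translate the failure into ObjectOptimisticLockingFailureException):

import org.springframework.orm.ObjectOptimisticLockingFailureException;

public void onPaymentIntentSucceeded(PaymentIntent intent) throws ServiceException {
   try {
      paymentService.handleIntentSucceeded(intent);
   } catch (ObjectOptimisticLockingFailureException e) {
      // Another request finished first and already set paid = true,
      // so this one can be ignored instead of failing the HTTP request.
      LOGGER.info("intent {} was already handled concurrently", intent.getId());
   }
}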

But if you are sure that you will not run multiple instances of the application and there are no big performance requirements, or you simply want to deal with the problem later and use a "workaround" until then, you can make the method synchronized (https://www.baeldung.com/java-synchronized). That way, the Java runtime ensures that the method cannot run in parallel.
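
A sketch of that workaround applied to the method from the question:

@Override
@Transactional(rollbackFor = {ServiceException.class})
public synchronized void handleIntentSucceeded(PaymentIntent intent) throws ServiceException {
   // ... same body as in the question; the synchronized keyword lets at most
   // one thread of this JVM execute the method at a time ...
}

One thing to be aware of with this combination: with Spring's proxy-based @Transactional, the commit only happens after the synchronized method has returned, so a second thread may still enter before the first thread's change is visible; synchronizing in a non-transactional caller around the transactional call avoids that gap.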

Tamás Pollák