0

I have a Spring Boot application that performs a large number of inserts from data collected via Kafka. I want to use saveAll for batch inserts to improve performance. However, some records are duplicates, and today I update them whenever I catch a DataIntegrityViolationException in my code. With batch inserts, is there a way to catch this exception for each duplicate record and handle it with the update code? Roughly, the current per-record flow looks like the sketch below (simplified; entity and repository names are placeholders).
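
```java
import org.springframework.dao.DataIntegrityViolationException;
import org.springframework.stereotype.Service;

// Simplified illustration of the current per-record flow.
// MessageIngestService, Message, and MessageRepository are placeholder names.
@Service
public class MessageIngestService {

    private final MessageRepository messageRepository;

    public MessageIngestService(MessageRepository messageRepository) {
        this.messageRepository = messageRepository;
    }

    public void ingest(Message incoming) {
        try {
            // Plain insert of a single record coming from Kafka.
            messageRepository.save(incoming);
        } catch (DataIntegrityViolationException e) {
            // Duplicate key: fall back to loading and updating the existing row.
            Message existing = messageRepository.findByExternalId(incoming.getExternalId());
            existing.setPayload(incoming.getPayload());
            messageRepository.save(existing);
        }
    }
}
```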

dev Joshi
  • 305
  • 2
  • 21

1 Answer

0

You could use Hibernate's @SQLInsert annotation to change the way the inserts are done. See Hibernate Transactions and Concurrency Using attachDirty (saveOrUpdate) for details.
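As a rough sketch of that approach, assuming a PostgreSQL database and a hypothetical Message entity with a unique external_id column, the generated insert can be replaced with an upsert so duplicates are resolved by the database instead of raising DataIntegrityViolationException. Note that Hibernate binds the parameters of a custom insert in its own column order (typically the non-id properties alphabetically, with the identifier last), so verify the order against the statement Hibernate logs:

```java
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;
import org.hibernate.annotations.SQLInsert;

// Hypothetical entity: table name, columns, and the unique constraint are assumptions.
@Entity
@Table(name = "message")
// Override Hibernate's generated INSERT with a PostgreSQL upsert.
// The parameter order must match Hibernate's binding order -- check the logged SQL.
@SQLInsert(sql = "INSERT INTO message (external_id, payload, id) VALUES (?, ?, ?) "
        + "ON CONFLICT (external_id) DO UPDATE SET payload = EXCLUDED.payload")
public class Message {

    @Id
    private Long id;

    @Column(name = "external_id", unique = true)
    private String externalId;

    @Column(name = "payload")
    private String payload;

    // getters and setters omitted for brevity
}
```

With this in place, saveAll can keep batching (provided JDBC batching is enabled, e.g. via spring.jpa.properties.hibernate.jdbc.batch_size). On MySQL the equivalent statement would use INSERT ... ON DUPLICATE KEY UPDATE instead of ON CONFLICT.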

Christian Beikov
  • 15,141
  • 2
  • 32
  • 58