
Good day,

I have a Spring Batch job that uses multiple steps. The job is defined as follows:

@Bean
public Job myJob() throws Exception {
  return jobBuilderFactory.get("MyJob")
      .repository(batchConfiguration.jobRepository())
      .start(step1())
      .next(step2())
      .build();
}

My step1() has its own reader, processor, and writer; in the writer, I update table A.

My step2() also has its own reader, processor, and writer. Its reader reads from table A and, by design, depends on the data updated in table A by step1().

However, when I run this batch job, I found that the step2() reader actually selects the same data that step1() read, i.e. the pre-update rows. Is there any way to make the step1() writer commit first, so that my step2() reader reads the updated data?

Panadol Chong

1 Answer


If you're using Spring Data, you can annotate your step1 write logic with @Transactional(propagation = Propagation.REQUIRES_NEW), so the operation will always open a new transaction, execute, and commit when the method finishes. Alternatively, if you are on a recent version of Spring Data, you can call saveAndFlush().
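A minimal sketch of that idea, assuming a Spring Data JPA setup; Step1WriteService, TableARepository, and TableA are hypothetical names standing in for your own classes:

```java
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical service wrapping the step1 write logic.
@Service
public class Step1WriteService {

  private final TableARepository repository; // assumed Spring Data JPA repository

  public Step1WriteService(TableARepository repository) {
    this.repository = repository;
  }

  // REQUIRES_NEW suspends any surrounding transaction, opens a fresh one,
  // and commits it as soon as this method returns, so the update to
  // table A is visible to later readers.
  @Transactional(propagation = Propagation.REQUIRES_NEW)
  public void updateTableA(TableA entity) {
    repository.saveAndFlush(entity); // flush pending changes to the database
  }
}
```

Note that @Transactional only takes effect when the method is called through the Spring proxy, i.e. from another bean, not via a self-invocation inside the same class.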

For legacy Hibernate/JDBC, you can simply open a connection, perform the first step's writes, and commit before step2 starts (both Session and EntityManager have a flush method).
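A plain-JDBC sketch of committing manually before the next step reads; the connection URL, credentials, table, and SQL are placeholders for your own:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Sketch: commit the step1 update explicitly so step2 sees it.
public class ManualCommitSketch {
  public static void main(String[] args) throws Exception {
    try (Connection conn = DriverManager.getConnection(
        "jdbc:yourdb://host/db", "user", "password")) { // placeholder URL
      conn.setAutoCommit(false); // take control of transaction boundaries
      try (PreparedStatement ps = conn.prepareStatement(
          "UPDATE table_a SET status = ? WHERE id = ?")) {
        ps.setString(1, "PROCESSED");
        ps.setLong(2, 1L);
        ps.executeUpdate();
      }
      conn.commit(); // make step1's changes durable and visible before step2 reads
    }
  }
}
```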

For more info:

https://www.baeldung.com/spring-transactional-propagation-isolation

What does EntityManager.flush do and why do I need to use it?

https://www.tutorialspoint.com/jdbc/commit-rollback.htm