I have a Spring Batch application that reads from and writes to the same table. Because the data volume is quite high, I use a paging reader to read the items. When I set the chunk size to more than 1, the page number advances incorrectly and some items are never read from the table. Any idea?
@Bean
@StepScope
public RepositoryItemReader<DedWorkFlowResponse> dedWorkFlowResponseItemReader() {
    return new RepositoryItemReaderBuilder<DedWorkFlowResponse>()
            .name("dedWorkFlowResponseItemReader")
            .methodName("findByPickedIs")
            .arguments(Collections.singletonList(false))
            .pageSize(100)
            .repository(dedWorkFlowResponseRepository)
            .sorts(Collections.singletonMap("id", Sort.Direction.ASC))
            .build();
}
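As far as I understand, the reader turns the builder settings above into a repository call like this for every page, incrementing the page index each time (a sketch; pageIndex is just an illustrative variable, not something from my code):

// per page, the reader passes the configured argument plus a Pageable built
// from pageSize and sorts
Pageable pageRequest = PageRequest.of(pageIndex, 100, Sort.by(Sort.Direction.ASC, "id"));
Page<DedWorkFlowResponse> page = dedWorkFlowResponseRepository.findByPickedIs(false, pageRequest);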
public interface DedWorkFlowResponseRepository extends JpaRepository<DedWorkFlowResponse, Long> {
    Page<DedWorkFlowResponse> findByPickedIs(boolean picked, Pageable pageable);
}
@Component
public class Writer implements ItemWriter<DedWorkFlowResponse> {

    @Autowired
    ResponseRepository responseRepository;

    @Autowired
    DedWorkFlowResponseRepository dedWorkFlowResponseRepository;

    @Override
    public void write(List<? extends DedWorkFlowResponse> list) throws Exception {
        List<Response> responses = new ArrayList<>();
        list.forEach(dedWorkFlowResponse -> {
            // build a Response row for the other table
            Response response = new Response();
            response.setName(dedWorkFlowResponse.getName());
            response.setProcessedDate(new Date(System.currentTimeMillis()));
            System.out.println(response.toString());
            responses.add(response);
            // flag the source row as processed so it is not read again
            dedWorkFlowResponse.setPicked(true);
            dedWorkFlowResponseRepository.saveAndFlush(dedWorkFlowResponse);
        });
        responseRepository.saveAll(responses);
    }
}
So in the writer I am also updating the DedWorkFlowResponse table itself: each processed row has picked set to true.
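Building on the call shown after the reader, this is a sketch of what I believe happens across the first two pages (page size 100; the row ranges are only illustrative):

// page 0: the first 100 rows with picked = false are read (say rows 1-100)
Page<DedWorkFlowResponse> firstPage = dedWorkFlowResponseRepository
        .findByPickedIs(false, PageRequest.of(0, 100, Sort.by(Sort.Direction.ASC, "id")));

// the writer then sets picked = true on those rows and flushes

// page 1: the one-page offset is now applied only to the rows that are still
// picked = false, so the rows that used to be 101-200 are skipped and
// rows 201-300 come back instead
Page<DedWorkFlowResponse> secondPage = dedWorkFlowResponseRepository
        .findByPickedIs(false, PageRequest.of(1, 100, Sort.by(Sort.Direction.ASC, "id")));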