In my application I have used a RepositoryItemReader and a FlatFileItemWriter. The components are as follows:
@Component
@StepScope
public class MyItemReader extends RepositoryItemReader<MyEntity> {

    public MyItemReader(MyEntityRepository repository) throws Exception {
        super();
        this.setRepository(repository);
        this.setPageSize(1000);
        Map<String, Sort.Direction> sort = new HashMap<>();
        sort.put("id", Sort.Direction.ASC);
        this.setSort(sort);
        this.setMethodName("findAll");
        this.afterPropertiesSet();
    }
}
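For context, RepositoryItemReader pages through the repository by issuing one query per page, roughly like this (a simplified sketch of what setPageSize(1000), the sort map, and setMethodName("findAll") amount to, not the actual framework code; pageNumber advances as items are handed out):

    // simplified sketch of each page read performed by RepositoryItemReader
    private Page<MyEntity> readPage(MyEntityRepository repository, int pageNumber) {
        Sort sort = Sort.by(Sort.Direction.ASC, "id");
        Pageable pageable = PageRequest.of(pageNumber, 1000, sort);
        return repository.findAll(pageable); // "findAll" is the configured method name
    }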
@Component
@StepScope
public class MyItemProcessor implements ItemProcessor<MyEntity, String> {

    private final MyEntityRepository repository;

    public MyItemProcessor(MyEntityRepository repository) {
        this.repository = repository;
    }

    @Override
    public String process(MyEntity item) {
        // our business logic to convert the item into a string (line);
        // the file-generated status is updated in the db before returning
        repository.save(item);  // persists the updated status (details omitted)
        return item.toString(); // placeholder for the real line conversion
    }
}
@Component
@StepScope
public class MyItemWriter extends FlatFileItemWriter<String> {

    public MyItemWriter(@Value("#{jobParameters['fileName']}") String fileName) throws Exception {
        this.setResource(new FileSystemResource(String.format("batch_files/%s", fileName)));
        this.setLineAggregator(new PassThroughLineAggregator<>());
        this.afterPropertiesSet();
    }
}
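For reference, I launch the job with the fileName parameter along these lines (a sketch; the launcher wiring and the concrete file name here are placeholders, only the "fileName" key matters for the writer above):

    JobParameters jobParameters = new JobParametersBuilder()
            .addString("fileName", "my_entities.txt") // resolved by @Value("#{jobParameters['fileName']}")
            .toJobParameters();
    jobLauncher.run(job, jobParameters);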
@Bean
public Step step(JobRepository jobRepository,
                 PlatformTransactionManager transactionManager,
                 MyItemReader myItemReader,
                 MyItemProcessor myItemProcessor,
                 MyItemWriter myItemWriter) {
    return new StepBuilder("fileGenerationStep", jobRepository)
            .<MyEntity, String>chunk(5000, transactionManager)
            .reader(myItemReader)
            .processor(myItemProcessor)
            .writer(myItemWriter)
            .build();
}
@Bean
public Job job(JobRepository jobRepository, Step step) {
    return new JobBuilder("fileJob", jobRepository)
            .incrementer(new RunIdIncrementer())
            .start(step)
            .build();
}
The problem: I have 12703 records in the database, but only 7703 records are written to the output file. Exactly one chunk's worth of data (5000 records) is missing from the output file.
I have tried adding a chunk listener:
@Log4j2
public class MyChunkListener implements ChunkListener {

    private int counter;

    @Override
    public void afterChunk(ChunkContext context) {
        log.info("Completed chunks: {}, Chunk complete status: {}", counter, context.isComplete());
    }

    @Override
    public void beforeChunk(ChunkContext context) {
        log.info("Chunk started, Context: {}", context.getStepContext().getStepExecutionContext());
        counter++;
    }

    @Override
    public void afterChunkError(ChunkContext context) {
        log.info("Chunk failed, Context: {}", context.getStepContext().getStepExecutionContext());
        log.error("Chunk number " + counter + " failed");
    }
}
@Bean
public Step step(JobRepository jobRepository,
                 PlatformTransactionManager transactionManager,
                 MyItemReader myItemReader,
                 MyItemProcessor myItemProcessor,
                 MyItemWriter myItemWriter) {
    return new StepBuilder("fileGenerationStep", jobRepository)
            .<MyEntity, String>chunk(5000, transactionManager)
            .listener(new MyChunkListener())
            .reader(myItemReader)
            .processor(myItemProcessor)
            .writer(myItemWriter)
            .build();
}
I expected to see "Chunk number [counter] failed" in the logs, but there were no logs related to any chunk failure.
Only 2 chunks of data were written, and I am unable to find out what happened to the other chunk; with 12703 records and a chunk size of 5000, it should be 3 chunks.
I have even tried changing the scope of the item reader and item writer from step scope to job scope, but nothing worked.
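To narrow this down further, the next thing I plan to try is a step-level listener that logs Spring Batch's own counters, to see whether the missing items were never read or were read but never written (a sketch; it would still need to be registered on the step with .listener(...)):

    @Log4j2
    public class CountLoggingListener implements StepExecutionListener {

        @Override
        public ExitStatus afterStep(StepExecution stepExecution) {
            // read/write/filter/commit counts as tracked by Spring Batch itself
            log.info("read={}, written={}, filtered={}, commits={}",
                    stepExecution.getReadCount(),
                    stepExecution.getWriteCount(),
                    stepExecution.getFilterCount(),
                    stepExecution.getCommitCount());
            return stepExecution.getExitStatus();
        }
    }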