I have a simple CSV file that I am reading in chunks of 1000 and inserting into a database. Before each insert I want to check whether the row already exists in the database and equals the input: if it exists and is equal, ignore it; otherwise insert or update. I implemented this check in an ItemProcessor, but then realized the per-row JDBC call is too slow (about 120 ms on average), so I want to batch the ids before calling the database and compare the results with the input. At that point the ItemReader was passing items to the ItemProcessor one at a time, so now I am trying to pass 1000 items at once to the ItemProcessor so that the JDBC call can be batched. I found some examples for this, but I have been unable to get the reader to work.
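For context, this is roughly what my current per-item processor does; InputRow, CustomDao, and findById are stand-ins for my actual row type and DAO methods:

import org.springframework.batch.item.ItemProcessor;

// Current per-item check: one JDBC lookup per row (~120 ms each).
public class ExistsCheckProcessor implements ItemProcessor<InputRow, InputRow> {

    private final CustomDao customDao;

    public ExistsCheckProcessor(CustomDao customDao) {
        this.customDao = customDao;
    }

    @Override
    public InputRow process(InputRow item) throws Exception {
        InputRow existing = customDao.findById(item.getId());
        if (existing != null && existing.equals(item)) {
            return null; // row already in DB and unchanged: filter it out
        }
        return item;     // new or changed row: pass it on for insert/update
    }
}

Here is my sample reader code: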
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Autowired;

public class CustomReader implements ItemReader<List<String>> {

    ItemReader<String> itemReader;

    @Autowired
    CustomDao customDao;

    @Override
    public List<String> read() throws Exception {
        // accumulate up to 1000 records from the delegate reader into one list
        List<String> records = new ArrayList<>();
        while (records.size() < 1000) {
            String record = itemReader.read();
            if (Objects.isNull(record)) {
                break;
            }
            records.add(record);
        }
        // returning null signals the end of input to Spring Batch
        return records.isEmpty() ? null : records;
    }
}
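The processor I am aiming for would then take the whole list and do a single batched lookup, roughly like this sketch (InputRow, CustomDao, and findByIds are placeholders; findByIds stands for one SELECT ... WHERE id IN (...) query):

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import org.springframework.batch.item.ItemProcessor;

// Intended batched check: one query per list of rows instead of one per row.
public class BatchedExistsCheckProcessor implements ItemProcessor<List<InputRow>, List<InputRow>> {

    private final CustomDao customDao;

    public BatchedExistsCheckProcessor(CustomDao customDao) {
        this.customDao = customDao;
    }

    @Override
    public List<InputRow> process(List<InputRow> items) throws Exception {
        List<Long> ids = items.stream()
                .map(InputRow::getId)
                .collect(Collectors.toList());
        Map<Long, InputRow> existing = customDao.findByIds(ids); // single IN-clause lookup

        List<InputRow> toWrite = new ArrayList<>();
        for (InputRow item : items) {
            InputRow dbRow = existing.get(item.getId());
            if (dbRow == null || !dbRow.equals(item)) {
                toWrite.add(item); // new or changed row: keep for insert/update
            }
        }
        return toWrite;
    }
}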
This is the step configuration:

@Bean
public Step step() {
    return stepBuilderFactory
            .get("step")
            .<List<String>, List<String>>chunk(1000)
            .reader(reader())
            .processor(processor())
            .writer(writer())
            .build();
}
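For completeness, the writer I have in mind would unwrap each list and hand the rows to a batched upsert; upsertAll is a placeholder for a JdbcTemplate batchUpdate call:

import java.util.List;

import org.springframework.batch.item.ItemWriter;

// Each "item" the writer receives is already a whole list of rows,
// so unwrap it and write the inner list in one batched statement.
public class ListUnwrappingWriter implements ItemWriter<List<InputRow>> {

    private final CustomDao customDao;

    public ListUnwrappingWriter(CustomDao customDao) {
        this.customDao = customDao;
    }

    @Override
    public void write(List<? extends List<InputRow>> lists) throws Exception {
        for (List<InputRow> rows : lists) {
            customDao.upsertAll(rows); // placeholder for a batched insert/update
        }
    }
}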
If anyone has a simple sample that passes a list of 1000 CSV rows to the processor at once, please share it. I checked the example in "Making a item reader to return a list instead single object - Spring batch", but with the code above I get exceptions and "unchecked call" warnings on the processor and writer.
I also checked "Spring Batch - Item Reader and ItemProcessor with a list", but ran into the exceptions listed in its comments and the same unchecked-type warnings. Please share a working sample of a reader, processor, and writer that handles multiple rows in one transaction without multi-threading.