I am using the code below to insert records from a large CSV (100,000 records) into Oracle using jOOQ. Here's the snippet:
CSVReader csvReader = null;
String csvError;
try {
    csvReader = new CSVReader(new FileReader(tempFile));
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
// delete previously uploaded stuff from the same user
clearTable(user.getUserId());
List<BSearchRecord> records = new ArrayList<>();
boolean isFirst = true;
while (csvReader.hasNext()) {
    if (isFirst) {
        // validate headers
        String[] headers = csvReader.next();
        uploadedHeaders = headers;
        csvError = validateHeadersLength(headers);
        if (csvError != null) {
            return csvError;
        }
        for (int i = 0; i < headers.length; i++) {
            csvError = validateEachHeader(i, headers[i]);
            if (csvError != null) {
                return csvError;
            }
        }
        isFirst = false;
        continue;
    } else {
        String[] row = csvReader.next();
        if (row != null) {
            BSearchRecord bSearchRecord = new BSearchRecord();
            bSearchRecord.set(RET_BSEARCH.UPLOADEDBY, user.getUserId());
            for (int i = 0; i < csvHeaders.length; i++) {
                Field field = bSearchRecord.field(backendColumns[i]);
                bSearchRecord.set(field, row[i]);
            }
            records.add(bSearchRecord);
        }
    }
}
db.batchInsert(records).execute(); // IS THIS OKAY? (is this batch enabled?)
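For context, one thing I considered (just my own sketch, not something I've confirmed from the docs) is splitting the list into smaller chunks before calling batchInsert(..); the chunk size of 1000 below is an arbitrary number I picked:

// Sketch only: split the records into smaller chunks before inserting.
// The chunk size of 1000 is an arbitrary value, not taken from the jOOQ docs.
int chunkSize = 1000;
for (int start = 0; start < records.size(); start += chunkSize) {
    List<BSearchRecord> chunk =
            records.subList(start, Math.min(start + chunkSize, records.size()));
    db.batchInsert(chunk).execute();
}

Is chunking like this even necessary, or does batchInsert(..) already handle large collections efficiently?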
I came across some suggestions like "PostgreSQL/JooQ bulk insertion performance issues when loading from CSV; how do I improve the process?".
However, my use case is a bit different, so I'm asking just to get a suggestion on whether I am doing this the right way or not.
Also, can you tell me whether the batchInsert(..) implementation in jOOQ supports batch execution? (In the docs I saw the .bind(..) approach, so I'm asking for clarity.)
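For reference, the .bind(..) approach I saw in the docs looks roughly like this in my case (my own paraphrase of the manual; the COLUMN_1 / COLUMN_2 fields and the values are just placeholders for my real schema):

// Sketch of the batch-with-bind-values style from the jOOQ manual, adapted to my table.
// RET_BSEARCH.COLUMN_1 and RET_BSEARCH.COLUMN_2 are placeholders for the actual columns.
db.batch(
      db.insertInto(RET_BSEARCH)
        .columns(RET_BSEARCH.UPLOADEDBY, RET_BSEARCH.COLUMN_1, RET_BSEARCH.COLUMN_2)
        .values((Object) null, null, null))
  .bind(user.getUserId(), "row 1 value 1", "row 1 value 2")
  .bind(user.getUserId(), "row 2 value 1", "row 2 value 2")
  .execute();

Should I prefer this over batchInsert(records) for 100,000 rows, or do both end up as JDBC batches?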