I have a huge CSV file that I need to:
- read
- validate
- write to a database

After some research, I found the following solution (using univocity-parsers):
// configure the input format
CsvParserSettings settings = new CsvParserSettings();

// get an iterator (iterate() streams the file instead of loading it all into memory)
CsvParser parser = new CsvParser(settings);
Iterator<String[]> it = parser.iterate(new File("/path/to/your.csv"), "UTF-8").iterator();
// connect to the database and create an insert statement
Connection connection = getYourDatabaseConnectionSomehow();
final int COLUMN_COUNT = 2;
PreparedStatement statement = connection.prepareStatement(
        "INSERT INTO some_table(column1, column2) VALUES (?,?)");

// rows that fail validation end up here
List<String[]> badDataList = new ArrayList<>();

// run batch inserts of 1000 rows per batch
int batchSize = 0;
while (it.hasNext()) {
    // get the next row from the parser
    String[] row = it.next();

    // validation: set aside rows that don't match the expected pattern
    if (!row[0].matches(someRegex)) { // someRegex: placeholder for whatever check you need
        badDataList.add(row);
        continue;
    }

    // set values in the statement, padding short rows with nulls
    for (int i = 0; i < COLUMN_COUNT; i++) {
        if (i < row.length) {
            statement.setObject(i + 1, row[i]);
        } else { // row in input is shorter than COLUMN_COUNT
            statement.setObject(i + 1, null);
        }
    }

    // add the values to the batch
    statement.addBatch();
    batchSize++;

    // once 1000 rows have made it into the batch, execute it
    if (batchSize == 1000) {
        statement.executeBatch();
        batchSize = 0;
    }
}

// the last batch probably won't have 1000 rows
if (batchSize > 0) {
    statement.executeBatch();
}
// alternatively, jOOQ's Loader API (DSLContext#loadArrays) can do the batching:
context.loadInto(table("book"))      // table() / field() are static imports from org.jooq.impl.DSL
       .batchAfter(500)
       .loadArrays(rows)             // rows: an Iterable<String[]> of the validated records
       .fields(field("column1"), field("column2"))
       .execute();
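(If I read the jOOQ docs correctly, batchAfter(500) buffers 500 rows per batch statement, so this does the same batching as the manual loop above, just with less code.)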
However, this is still too slow because everything runs in a single thread. Is there a way to make it faster with multi-threading?
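To make the question concrete, this is the kind of producer/consumer setup I had in mind: one thread parses the CSV and feeds a BlockingQueue, and a fixed pool of workers does the batch inserts, each with its own connection (since JDBC objects must not be shared between threads). The pool size, queue capacity, and batch size are guesses, and getYourDatabaseConnectionSomehow is the same placeholder as above:

import com.univocity.parsers.csv.CsvParser;
import com.univocity.parsers.csv.CsvParserSettings;

import java.io.File;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelCsvLoader {

    // sentinel row that tells the workers the parser is finished
    private static final String[] DONE = new String[0];

    public static void main(String[] args) throws Exception {
        final int workers = 4; // guess; tune to what the database can handle
        final BlockingQueue<String[]> queue = new ArrayBlockingQueue<>(10_000);
        ExecutorService pool = Executors.newFixedThreadPool(workers);

        for (int w = 0; w < workers; w++) {
            pool.submit(() -> {
                // each worker owns its OWN connection and statement
                try (Connection c = getYourDatabaseConnectionSomehow();
                     PreparedStatement ps = c.prepareStatement(
                             "INSERT INTO some_table(column1, column2) VALUES (?,?)")) {
                    c.setAutoCommit(false); // commit once per batch, not per row
                    int batchSize = 0;
                    while (true) {
                        String[] row = queue.take();
                        if (row == DONE) {
                            queue.put(DONE); // pass the sentinel on to the other workers
                            break;
                        }
                        ps.setObject(1, row.length > 0 ? row[0] : null);
                        ps.setObject(2, row.length > 1 ? row[1] : null);
                        ps.addBatch();
                        if (++batchSize == 1000) {
                            ps.executeBatch();
                            c.commit();
                            batchSize = 0;
                        }
                    }
                    if (batchSize > 0) {
                        ps.executeBatch();
                    }
                    c.commit();
                }
                return null;
            });
        }

        // producer: a single thread parses the CSV and feeds the queue
        // (the validation step would stay here, exactly as in the loop above)
        CsvParser parser = new CsvParser(new CsvParserSettings());
        for (String[] row : parser.iterate(new File("/path/to/your.csv"), "UTF-8")) {
            queue.put(row); // blocks while the queue is full, keeping memory bounded
        }
        queue.put(DONE);
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }

    // same placeholder as in the snippet above
    private static Connection getYourDatabaseConnectionSomehow() {
        throw new UnsupportedOperationException("replace with real connection logic");
    }
}

Is this the right direction, or is there a better-established pattern? I'm also unsure whether parallel inserts into the same table would just move the bottleneck to the database side.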