I have a Spring Batch job with flows, steps, readers, processors, and writers that copies data from one DB source to another (the target being an Oracle DB). The batch works fine and fast with several million rows. I now want to reuse the exact same method below to write the same data to a PostgreSQL DB, hence the getSessionFactory()
method call that returns the appropriate session factory depending on my needs. The problem is that nothing seems to happen, or it takes abnormally long.
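For reference, the session factory selection is nothing fancy. This is only a simplified sketch; the field and flag names below are placeholders, not the real ones from my project:

// Simplified sketch; names are placeholders for illustration only
private SessionFactory oracleSessionFactory;    // bound to the Oracle datasource
private SessionFactory postgresSessionFactory;  // bound to the PostgreSQL datasource
private String targetDatabase;                  // set per step, e.g. "ORACLE" or "POSTGRESQL"

private SessionFactory getSessionFactory() {
    return "POSTGRESQL".equals(targetDatabase) ? postgresSessionFactory : oracleSessionFactory;
}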
I would like to know why there is such a difference. Is there some kind of limitation on the number of inserts we can do in parallel with PostgreSQL? Or is there something wrong with this code?
public void myMethod(List&lt;MyObject&gt; list) {
    SessionFactory sessionFactory = getSessionFactory();
    sessionFactory.getCurrentSession().doWork(connection -> {
        StringBuilder stringBuilder = new StringBuilder("INSERT INTO \"SOME_TABLE\" (\"ID\", \"SOME_COLUMN\", \"UPDATED_AT\") VALUES (?, ?, ?)");
        try (PreparedStatement preparedStatement = connection.prepareStatement(stringBuilder.toString())) {
            int i = 0;
            for (MyObject myObject : list) {
                preparedStatement.setBigDecimal(1, new BigDecimal(myObject.getId()));
                preparedStatement.setBigDecimal(2, new BigDecimal(myObject.getSomeColumnValue()));
                preparedStatement.setDate(3, new Date(System.currentTimeMillis())); // java.sql.Date
                preparedStatement.addBatch();
                // Execute the batch every 500,000 rows, and once more on the last row for the remainder
                if (++i % 500000 == 0 || i == list.size()) {
                    preparedStatement.executeBatch();
                }
            }
        }
    });
}
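For completeness, the writer simply hands each chunk to this method. A minimal sketch, assuming the List-based ItemWriter signature (Spring Batch 4.x); the class and bean names (MyObjectWriter, MyObjectDao) are made up for illustration:

import java.util.ArrayList;
import java.util.List;
import org.springframework.batch.item.ItemWriter;

public class MyObjectWriter implements ItemWriter&lt;MyObject&gt; {

    private final MyObjectDao dao;   // hypothetical bean that contains myMethod(...)

    public MyObjectWriter(MyObjectDao dao) {
        this.dao = dao;
    }

    @Override
    public void write(List&lt;? extends MyObject&gt; items) throws Exception {
        dao.myMethod(new ArrayList&lt;&gt;(items));  // each chunk is inserted in one call
    }
}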