In an old project, I have a table with indices on 3 columns. I am executing batched queries (~5000 UPDATE statements) like UPDATE mytable SET mycolumn = 'blah' WHERE myIndexedColumn = 'someId'.
The executeBatch call takes approximately 1.5 hours. I am using an Oracle 11g database, Java 6, and Spring Batch. The table concerned holds 1,700,000 rows.
Statement ps = null;
Connection connection = null;
try {
    connection = myDatasource.getConnection();
    connection.setAutoCommit(false);
    ps = connection.createStatement();
    for (String request : myListOfRequests) {
        ps.addBatch(request);
    }
    ps.executeBatch();
    connection.commit();
} catch (SQLException ex) {
    LOGGER.error("My Errors : {}", ex.getMessage());
} finally {
    try {
        if (ps != null) {
            ps.clearBatch();
            ps.close();
        }
        if (connection != null) {
            connection.close();
        }
    } catch (SQLException ex) {
        LOGGER.error("Error while closing resources : {}", ex.getMessage());
    }
}
I have dropped the indices, but I didn't notice a significant difference. I cannot use more modern technologies, and dropping and rebuilding the table is not a safe option. Do you have any ideas on how I can speed this task up?
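One rewrite I am considering is switching from a plain Statement to a PreparedStatement with bind variables, so Oracle parses the statement once instead of ~5000 times, and flushing the batch in fixed-size chunks. Here is a sketch for Java 6 (the `BatchUpdater` class name, the `BATCH_SIZE` value, and the `newValue` parameter are my own placeholders; the table and column names are from my example above):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class BatchUpdater {

    // Statement shape from my example above: parsed once, executed many times.
    static final String SQL = "UPDATE mytable SET mycolumn = ? WHERE myIndexedColumn = ?";

    // Flush every N rows so driver buffers stay small; the value is a guess to tune.
    static final int BATCH_SIZE = 500;

    // Split the id list into fixed-size chunks, one chunk per executeBatch round trip.
    static List<List<String>> partition(List<String> ids, int size) {
        List<List<String>> chunks = new ArrayList<List<String>>();
        for (int i = 0; i < ids.size(); i += size) {
            chunks.add(ids.subList(i, Math.min(i + size, ids.size())));
        }
        return chunks;
    }

    static void update(Connection connection, String newValue, List<String> ids)
            throws SQLException {
        connection.setAutoCommit(false);
        PreparedStatement ps = connection.prepareStatement(SQL);
        try {
            for (List<String> chunk : partition(ids, BATCH_SIZE)) {
                for (String id : chunk) {
                    ps.setString(1, newValue);
                    ps.setString(2, id);
                    ps.addBatch();
                }
                ps.executeBatch(); // one network round trip per chunk
            }
            connection.commit();
        } catch (SQLException ex) {
            connection.rollback();
            throw ex;
        } finally {
            ps.close();
        }
    }
}
```

Would this kind of rewrite be expected to help here, or is the bottleneck elsewhere?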