I have code like the following:
try (Connection connection = this.getDataSource().getConnection();
     PreparedStatement statement = connection.prepareStatement(sqlQuery)) {
    statement.setFetchSize(10000); // rows fetched per database round trip
    try (ResultSet resultSet = statement.executeQuery()) {
        while (resultSet.next()) {
            // populating an ArrayList from the values in resultSet
        }
    } catch (Exception e) {
        LOGGER.error("Exception : " + e);
    }
} catch (SQLException e) {
    LOGGER.error("Exception : " + e);
}
My understanding is as follows:
The statement fetch size is 10000. When statement.executeQuery() is executed, it returns a ResultSet cursor with 10000 rows held in memory. Each call to resultSet.next() reads one row from that in-memory buffer. When no rows remain in the buffer, the database is queried again, another 10000 rows are fetched, and they are stored in the buffer. This continues until there are no more rows to fetch from the DB.
So if my understanding is correct, how many actual DB calls will there be for a total of 210000 rows? Is it 21 (210000 / 10000)?
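For what it's worth, here is the back-of-the-envelope arithmetic behind that guess. This is only a sketch of my own assumption (one round trip per full batch); I don't know whether the driver needs an extra trip just to discover that no rows are left:

    int totalRows = 210_000;  // total rows the query returns
    int fetchSize = 10_000;   // statement fetch size
    // Ceiling division: the number of batches the driver would have to fetch.
    int roundTrips = (totalRows + fetchSize - 1) / fetchSize; // = 21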
Also, when and who calls the DB (once all the rows in the buffer have been read) to fetch more rows (10000, in my case) and store them in the buffer? And when is the buffer cleared?
Please correct me if I am wrong in my understanding.
I need to work with millions of rows in an Oracle database.
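For context, this is the streaming pattern I am considering for the full dataset, so that at most one fetch-size batch of rows sits in memory at a time instead of one giant ArrayList. It is only a sketch; StreamingReader, forEachRow, and RowHandler are names I made up for illustration:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import javax.sql.DataSource;

    public class StreamingReader {

        // Callback that may throw SQLException, unlike java.util.function.Consumer.
        @FunctionalInterface
        public interface RowHandler {
            void handle(ResultSet rs) throws SQLException;
        }

        // Streams each row through the callback instead of collecting
        // millions of rows into a single list.
        public static void forEachRow(DataSource ds, String sqlQuery, RowHandler handler)
                throws SQLException {
            try (Connection connection = ds.getConnection();
                 PreparedStatement statement = connection.prepareStatement(sqlQuery)) {
                statement.setFetchSize(10_000); // rows fetched per database round trip
                try (ResultSet resultSet = statement.executeQuery()) {
                    while (resultSet.next()) {
                        handler.handle(resultSet); // process one row at a time
                    }
                }
            }
        }
    }

A call would then look like forEachRow(dataSource, "SELECT ID, NAME FROM MY_TABLE", rs -> process(rs.getLong(1), rs.getString(2))), where MY_TABLE and process are placeholders, not my real schema.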
Thanks for any pointers/info
Regards,
SD