I have a database with 8+ million records that I need to process in particular ways, using a program written in Java. After looking some things up, I found the following related posts:
- Efficient way to go over result set in Java
- Fastest way to iterate through large table using JDBC
- how do I load 100 million rows in to memory
- Retrieving a million records from a database
This is my code, which returns the items stored in the Tags column of my MySQL database:
public ResultSet getAllTags() {
    String query = "SELECT Tags FROM dataset";
    ResultSet rs = null;
    try {
        connection = ConnectionFactory.getConnection(DATABASE);
        preparedStatement = connection.prepareStatement(query,
                ResultSet.TYPE_SCROLL_SENSITIVE, ResultSet.CONCUR_READ_ONLY);
        preparedStatement.setFetchSize(Integer.MIN_VALUE);
        // on a PreparedStatement, executeQuery() takes no arguments
        rs = preparedStatement.executeQuery();
        // following lines are for testing, to see what comes out of the result set;
        // the cursor must be advanced with next() before a column can be read
        if (rs.next()) {
            System.out.println("output: " + rs.getString(1));
        }
        return rs;
    } catch (Exception ex) {
        ex.printStackTrace();
        return null;
    } finally {
        closeAll();
    }
}
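The caller then consumes the returned ResultSet roughly like this (a simplified sketch; handleTags is a hypothetical placeholder for my actual per-row processing):

try {
    ResultSet rs = getAllTags();
    while (rs.next()) {
        handleTags(rs.getString(1)); // process one row's Tags value
    }
} catch (SQLException ex) {
    ex.printStackTrace();
}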
Here I return the ResultSet so that I can process each row in the rs.next() loop shown above. However, at the line rs = preparedStatement.executeQuery(); it starts to eat all of my computer's free memory (I work on Mac OS X with 8 GB of RAM; with only Eclipse open I have roughly 5 GB free, and while the application runs it drops below 100 MB free), forcing me to shut down the database connection and the application. So I assume this can be called a memory leak?
Can someone explain what I'm doing wrong, and why this issue occurs even though I follow the instructions found in the other posts, which deal with similar numbers of records?
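For reference, this is how I understood the streaming setup described in the posts above (a minimal sketch, assuming MySQL Connector/J; the url/user/password arguments and the processRow method are placeholders for my actual code):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class TagStreamer {

    public void streamAllTags(String url, String user, String password) throws SQLException {
        String query = "SELECT Tags FROM dataset";
        // try-with-resources closes the ResultSet, statement and connection
        // only after all rows have been consumed
        try (Connection conn = DriverManager.getConnection(url, user, password);
             PreparedStatement ps = conn.prepareStatement(query,
                     ResultSet.TYPE_FORWARD_ONLY,   // streaming requires a forward-only,
                     ResultSet.CONCUR_READ_ONLY)) { // read-only result set
            // Connector/J treats this combination as a signal to stream rows
            // one at a time instead of buffering the whole result set in memory
            ps.setFetchSize(Integer.MIN_VALUE);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    processRow(rs.getString(1)); // handle one row's Tags value
                }
            }
        }
    }

    private void processRow(String tags) {
        // placeholder for the actual per-row processing
        System.out.println(tags);
    }
}

Note that in this sketch the rows are consumed inside the method, since my finally { closeAll(); } would otherwise close the connection before the caller could iterate over the returned ResultSet.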