
My database is hosted on a MySQL server and I'm using Java to analyze the data.

My issue: executing the SELECT query returns a 2.5 GB result set. I don't want to load all of that data into memory, so is there any way to retrieve the data continuously and process it as it streams in?

'Limit by rows' is not an option, because the 2.5 GB of data is joined and retrieved from 4 tables, so paging with LIMIT would increase my total run time a lot.

I've tried statement.setFetchSize(50), but it did not seem to work as I expected.

Any suggestions would be really appreciated! Thanks!

Nick Ren
    I would increase your heap size. If you need to process 2.5 GB of data, it doesn't sound like a lot to have 8-32 GB of memory. – Peter Lawrey May 31 '13 at 15:25
  • You want to take a look here: http://stackoverflow.com/questions/2447324/streaming-large-result-sets-with-mysql – Farlan May 31 '13 at 15:50

2 Answers


Statement stmt = readOnlyConn.createStatement(java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY);

stmt.setFetchSize(Integer.MIN_VALUE);

The code above solved my issue. Thanks for the help!
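For context, here is a minimal, self-contained sketch of how such a streaming statement can be consumed row by row. The connection URL, credentials, table, and column names are just placeholders, not from the original post:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class StreamingQueryExample {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details; replace with your own.
        String url = "jdbc:mysql://localhost:3306/mydb";
        try (Connection readOnlyConn = DriverManager.getConnection(url, "user", "password")) {
            // TYPE_FORWARD_ONLY + CONCUR_READ_ONLY + a fetch size of Integer.MIN_VALUE
            // tells MySQL Connector/J to stream rows one at a time instead of
            // buffering the whole result set in memory.
            try (Statement stmt = readOnlyConn.createStatement(
                    ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
                stmt.setFetchSize(Integer.MIN_VALUE);

                try (ResultSet rs = stmt.executeQuery("SELECT col_a, col_b FROM big_joined_view")) {
                    while (rs.next()) {
                        // Process each row as it arrives; only the current row is held in memory.
                        process(rs.getString("col_a"), rs.getLong("col_b"));
                    }
                }
            }
        }
    }

    private static void process(String a, long b) {
        // Placeholder for the actual analysis logic.
    }
}

Note that while the result set is being streamed, Connector/J will not let you run other queries on the same connection until the streaming result set has been fully read or closed.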

Nick Ren

> Statement stmt = readOnlyConn.createStatement(java.sql.ResultSet.TYPE_FORWARD_ONLY, java.sql.ResultSet.CONCUR_READ_ONLY);
> stmt.setFetchSize(Integer.MIN_VALUE);
>
> The code above solved my issue. Thanks for the help!

Yes, BUT if you are using PostgreSQL, you also have to turn autocommit OFF! (Realised that after 2 hours of work =D)

See the PostgreSQL docs here.
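For PostgreSQL, a minimal sketch of the same idea (connection details and query are placeholders; the driver only uses a server-side cursor when autocommit is off, the statement is forward-only, and the fetch size is positive):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class PostgresStreamingExample {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details; replace with your own.
        String url = "jdbc:postgresql://localhost:5432/mydb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            // Autocommit must be off, otherwise the driver fetches the whole
            // result set at once instead of using a cursor.
            conn.setAutoCommit(false);

            try (Statement stmt = conn.createStatement(
                    ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
                stmt.setFetchSize(50); // rows per round trip; not Integer.MIN_VALUE here

                try (ResultSet rs = stmt.executeQuery("SELECT col_a FROM big_joined_view")) {
                    while (rs.next()) {
                        // Rows arrive in batches of the fetch size.
                        System.out.println(rs.getString("col_a"));
                    }
                }
                conn.commit(); // end the transaction that kept the cursor open
            }
        }
    }
}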

Jaroslav
  • Hi Jaroslav, this does not provide an answer to the question - it really belongs as a comment on the other answer, but you will need to earn more reputation to be able to [leave comments](http://stackoverflow.com/help/privileges/comment), so in the meantime you need to concentrate on posting good-quality original answers and questions which can earn you reputation. – Vince Bowdren Nov 02 '16 at 17:21