What I'm trying to do is read 1,048,575 records from an Oracle DB. The number is not random: it's the maximum number of rows in an Excel file minus 1 (I want all rows filled, plus there must be a title row).
My project is built on Spring 4. I'm using org.springframework.data.querydsl.QueryDslPredicateExecutor
to get data from the database, so paging, sorting and QueryDSL predicates
are required.
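For context, the repository looks roughly like this. This is a minimal sketch, not my exact code; the entity name `Record` and the repository name are placeholders:

```java
// Hypothetical wiring sketch — entity and repository names are placeholders.
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.querydsl.QueryDslPredicateExecutor;

// QueryDslPredicateExecutor adds findAll(Predicate, Pageable),
// which is what gives me paging + sorting + predicates in one call.
interface RecordRepository
        extends JpaRepository<Record, Long>, QueryDslPredicateExecutor<Record> {
}
```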
When I tried to get all 1,048,575 records at once I got this exception:
java.lang.OutOfMemoryError: GC overhead limit exceeded
After googling it for a while (for example there) I haven't found any memory leaks. After splitting the records into 100k chunks, it crashed somewhere between 500k and 600k.
The project runs on Jetty 9.3 with the following JVM arguments:
-Xms512m -Xms2048m -Xmx2048m
The DB is configured well: it has a proper structure and the required indexes. Let's say it's immutable.
With all that in mind, is there any way I can actually read that data without increasing the heap size?