I have code that looks like this:
List<Long> eventIds = ... // just an ArrayList of ids
Iterable<List<Long>> partitions = Iterables.partition(eventIds, 10); // eventIds are partitioned into smaller chunks so each database call requires fewer resources
Map<Integer, YearlyStatistics> yearlyStatisticsMap = new HashMap<>();
for (List<Long> partition : partitions) {
    List<Event> events = database.getEvents(partition); // This is where I get an OutOfMemoryError after a couple of loops. It looks like the previous list of events is never garbage collected.
    populateStatistics(events, yearlyStatisticsMap);
}
One Event is never larger than 1 MB, and the JVM has 250 MB of memory to work with. The reason for partitioning the eventIds is that I will definitely run out of memory if I try to fetch every object from the database at once. So I thought I would request the data in smaller chunks and the JVM would free up memory after each iteration, once populateStatistics has been called. That doesn't seem to be the case: around the ~50th loop an OutOfMemoryError is thrown. Is there any way to optimize this code so that the memory from previous events is freed up?
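For reference, here is a self-contained sketch of the pattern I am using. Everything besides the loop structure is a hypothetical stand-in (the Event and YearlyStatistics types, the database call, and populateStatistics are stubbed, and Guava's Iterables.partition is replaced with plain subList views), since I cannot share the real classes:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PartitionedFetch {
    // Hypothetical stand-ins for the real Event / YearlyStatistics classes
    record Event(long id, int year) {}
    record YearlyStatistics(int count) {}

    // Stub for database.getEvents(partition)
    static List<Event> getEvents(List<Long> ids) {
        List<Event> events = new ArrayList<>(ids.size());
        for (long id : ids) {
            events.add(new Event(id, 2000 + (int) (id % 20)));
        }
        return events;
    }

    static void populateStatistics(List<Event> events, Map<Integer, YearlyStatistics> map) {
        for (Event e : events) {
            // Merge counts per year. Note this keeps no reference to the Event itself;
            // if it did, every fetched partition would stay reachable via the map
            // and could never be garbage collected.
            map.merge(e.year(), new YearlyStatistics(1),
                    (a, b) -> new YearlyStatistics(a.count() + b.count()));
        }
    }

    public static void main(String[] args) {
        List<Long> eventIds = new ArrayList<>();
        for (long i = 0; i < 100; i++) eventIds.add(i);

        Map<Integer, YearlyStatistics> yearlyStatisticsMap = new HashMap<>();
        // Plain-JDK equivalent of Iterables.partition(eventIds, 10)
        for (int from = 0; from < eventIds.size(); from += 10) {
            List<Long> partition =
                    eventIds.subList(from, Math.min(from + 10, eventIds.size()));
            List<Event> events = getEvents(partition);
            populateStatistics(events, yearlyStatisticsMap);
            // 'events' goes out of scope here; my expectation is that it becomes
            // collectible as long as nothing else holds a reference into it.
        }
        System.out.println(yearlyStatisticsMap.size());
    }
}
```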