In my code, I have been querying a list of about 10K items at a time using the repository's findAll() method.
There is only one List reference, to which I assign the result of findAll(), and then I run a loop over the items of that List.
The total number of records in the DB is usually a few multiples of 10K, i.e. the outer loop currently iterates somewhere between 6 and 12 times.
What I observed is that processing in that loop starts taking progressively more time after around 20K items have been read.
If, instead of reassigning the reference, I use List.addAll() to add the findAll() items, and clear the list with List.clear() before adding the next chunk, then the execution time of each iteration remains constant and does not increase.
Code where iteration time keeps increasing:
while (condition) {
    // A fresh list is returned by findAll() and assigned on every pass
    List<T> reference = repo.findAll();
    for (T t : reference) {
        // Processing
    }
    // Re-check the condition; exit the loop if it is false
}
Code with constant iteration time:
List<T> reference = new ArrayList<>();
while (condition) {
    // Reuse the same list: add this chunk, process it, then clear it
    reference.addAll(repo.findAll());
    for (T t : reference) {
        // Processing
    }
    reference.clear();
    // Re-check the condition; exit the loop if it is false
}
I am not sure why there is this gap, since the objects should be garbage collected in the first version too.
Related question: list.clear() vs list = new ArrayList()