I need to use a HashMap whose keys are of type Long and whose values are user-defined objects, declared as:

HashMap<Long, SomeClass> dummy = new HashMap<>();

Initially this dummy map contains about 10 million <key, value> pairs, added using:

dummy.put(SomeLong, new SomeClass(SomeParameters));

The memory consumed is 7-8 GB. After the map is built, 80-90% of the entries have to be removed:
for (Iterator<Map.Entry<Long, SomeClass>> it = dummy.entrySet().iterator(); it.hasNext();) {
    Map.Entry<Long, SomeClass> entry = it.next();
    if (SomeCondition) {
        it.remove();
    }
}
The memory in use is still the same after removing these entries; I checked with Runtime.getRuntime().totalMemory() and Runtime.getRuntime().freeMemory(). Why has the memory not been reclaimed after this remove() operation?
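For reference, this is roughly how I take the measurements. It is a self-contained toy version: the class name, map size and ArrayList values are just placeholders standing in for my real data, and System.gc() is only a hint to the JVM, not a guaranteed collection.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;

public class MemoryCheck {

    // Approximate heap in use: total heap minus the free part of it.
    static long usedMemoryBytes() {
        Runtime rt = Runtime.getRuntime();
        rt.gc(); // only a request; the JVM may ignore it
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        Map<Long, ArrayList<Integer>> dummy = new HashMap<>();
        for (long i = 0; i < 1_000_000; i++) {
            ArrayList<Integer> list1 = new ArrayList<>();
            list1.add((int) i);
            dummy.put(i, list1);
        }
        long before = usedMemoryBytes();

        // Drop roughly 90% of the entries, as in the iterator loop above.
        dummy.entrySet().removeIf(e -> e.getKey() % 10 != 0);

        long after = usedMemoryBytes();
        System.out.printf("used before=%,d bytes, after=%,d bytes%n", before, after);
    }
}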
I am creating this kind of HashMap about 1000-2000 times over the course of the program, and eventually it fails with java.lang.OutOfMemoryError: GC overhead limit exceeded.
Can anyone help? Thanks
***************** Update/Additional information ****************
SomeClass is defined as:
import java.util.ArrayList;

class SomeClass {
    private ArrayList<Integer> list1;
    private ArrayList<Integer> list2;

    public SomeClass(int l, ArrayList<Integer> l2) {
        list2 = l2;               // keeps a reference to the list passed in
        list1 = new ArrayList<>();
        list1.add(l);
    }

    public void addList1(int l) { list1.add(l); }

    public ArrayList<Integer> getList1() { return list1; }

    public ArrayList<Integer> getList2() { return list2; }
}
For some reason, I had earlier planned to use a BitSet instead of the ArrayList list1. If I replace list1 with a BitSet and set the l-th bit instead of adding l to list1, and then build the dummy HashMap from objects of this class, the memory behaviour is different. Surprisingly, the memory is reclaimed after the remove operation on the HashMap: after removing about 80% of the entries, the memory in use also drops by roughly 40%.
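For clarity, the BitSet variant looks roughly like this (a sketch from memory; the class name is only for illustration, and only list1 is swapped out while list2 stays as before):

import java.util.ArrayList;
import java.util.BitSet;

class SomeClassWithBitSet {
    private BitSet bits1;              // replaces the ArrayList list1
    private ArrayList<Integer> list2;

    public SomeClassWithBitSet(int l, ArrayList<Integer> l2) {
        list2 = l2;
        bits1 = new BitSet();
        bits1.set(l);                  // set the l-th bit instead of list1.add(l)
    }

    public void addList1(int l) { bits1.set(l); }

    public BitSet getBits1() { return bits1; }

    public ArrayList<Integer> getList2() { return list2; }
}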
Is it that memory allocated to an ArrayList is not being reclaimed? :( :(