I am getting a `java.lang.OutOfMemoryError: GC overhead limit exceeded` error when populating a HashMap with more than 100,000 objects.
When my program starts, it reads key:value pairs from a CSV file. It then builds a HashMap whose keys are Strings and whose values are HashSets of objects.
It's convenient to keep this approach because at the end I print statistics based on these mappings.
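For reference, the loading code is essentially the sketch below (`Row` and `Loader` are placeholder names; my real value class has more fields and the parsing is more involved):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Row stands in for my real value class.
record Row(String key, String value) {}

public class Loader {
    public static Map<String, Set<Row>> load(Path csv) throws IOException {
        Map<String, Set<Row>> map = new HashMap<>();
        try (BufferedReader reader = Files.newBufferedReader(csv)) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] parts = line.split(":", 2); // key:value pairs
                // Group every row under its key.
                map.computeIfAbsent(parts[0], k -> new HashSet<>())
                   .add(new Row(parts[0], parts[1]));
            }
        }
        return map;
    }
}
```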
I see a few options:

- Reduce the object size. This shrinks the problem, but it may come back with more objects.
- Configure the map's initial capacity and load factor (sketched after this list). Same concern as above.
- Increase the heap size (also covered below). Same concern as above.
- Process objects sequentially and discard them. This fixes the problem, but I lose the mapping between objects.
- Offload storage to a database?
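To make the capacity/load-factor option concrete, this is roughly what I'd try first; the 200,000-key estimate is a guess based on my data:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class Presized {
    public static void main(String[] args) {
        int expectedKeys = 200_000; // guess; tune to the real data
        // With the default load factor of 0.75, a capacity of
        // expectedKeys / 0.75 (+1) keeps the map below its resize
        // threshold, so it never rehashes while loading.
        Map<String, Set<Object>> map =
                new HashMap<>((int) (expectedKeys / 0.75f) + 1);
        map.computeIfAbsent("example", k -> new HashSet<>()).add("value");
        System.out.println(map.size());
    }
}
```

I assume I'd pair this with a larger heap, e.g. `java -Xmx2g MyApp`; I'm aware `-XX:-UseGCOverheadLimit` only disables the check and would likely just turn the error into a plain `OutOfMemoryError` later.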
I'd greatly appreciate your thoughts.