I have a serialized PatriciaTrie (https://commons.apache.org/proper/commons-collections/apidocs/org/apache/commons/collections4/trie/PatriciaTrie.html) on disk. On disk it occupies roughly 7.4 GB. I am running on a 64 GB RAM server, and when I deserialize it, the memory consumption of the process climbs to about 40 GB. Is this plausible? The highest-voted answer to "Serialized object size vs in memory object size in Java" says that "the size in memory will be usually between half and double the serializable size!" Based on that, I was expecting the in-memory size to stay under roughly 15 GB; 40 GB is too much, as other processes need to be loaded as well.
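For context, here is the rough back-of-envelope estimate I did of per-entry heap cost. All the byte counts are my own approximations for a 64-bit HotSpot JVM with compressed oops (not measured values), and the entry count and average key length are made-up placeholders, since I can't share the real numbers:

```java
// Rough back-of-envelope estimate of per-entry heap cost for a
// PatriciaTrie<String, V>. Byte counts are approximations for a
// 64-bit HotSpot JVM with compressed oops, NOT measured values.
public class TrieFootprintEstimate {

    // HotSpot pads objects to 8-byte boundaries.
    static long align(long bytes) {
        return (bytes + 7) & ~7L;
    }

    public static void main(String[] args) {
        long entries = 50_000_000L;   // hypothetical entry count
        int avgKeyChars = 20;         // hypothetical average key length

        // Each mapping lives in an internal trie-entry node that (as I
        // understand the commons-collections source) holds the key, the
        // value, a bit index and several parent/child references.
        long trieEntry = 16 /* object header */
                       + 4  /* int bitIndex */
                       + 6 * 4 /* ~6 compressed refs */;

        // A String key: the String object itself plus its backing
        // char[] (2 bytes per char on pre-Java-9 layouts).
        long stringKey = 32 /* String + header */
                       + 16 /* char[] header */
                       + 2L * avgKeyChars;

        long perEntry = align(trieEntry) + align(stringKey);
        System.out.println("approx bytes per entry: " + perEntry);
        System.out.println("approx GB total: "
                + (perEntry * entries) / (1024.0 * 1024 * 1024));
    }
}
```

The same 20-character key costs only about 22 bytes in the serialized stream (modified UTF-8 plus a length prefix), so the per-entry blow-up from headers, padding and references alone already looks like 5-6x, before counting the values.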
I thought of using http://docs.oracle.com/javase/7/docs/api/java/lang/instrument/Instrumentation.html to measure the in-memory size, but the answers to "Calculate size of Object in Java" say that it "can be used to get the implementation specific approximation of object size." So that would again only give me an approximation.
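As a cruder alternative to Instrumentation (which needs a -javaagent), I tried sampling the heap before and after loading. This is a sketch of what I mean; the TreeMap build is just a stand-in for my actual ObjectInputStream.readObject() call, and the System.gc() hint makes the numbers only a sanity check, not an exact size:

```java
import java.util.Map;
import java.util.TreeMap;

// Crude heap-delta measurement: sample used heap before and after
// building (or deserializing) a structure. GC is nondeterministic,
// so treat the result as a ballpark figure only.
public class HeapDelta {

    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        System.gc(); // best-effort hint; not guaranteed to run
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedHeap();

        // Stand-in for the deserialized trie; in the real code this
        // would be the ObjectInputStream.readObject() call.
        Map<String, String> map = new TreeMap<>();
        for (int i = 0; i < 100_000; i++) {
            map.put("key-" + i, "value-" + i);
        }

        long after = usedHeap();
        System.out.println("approx retained bytes: " + (after - before));
    }
}
```

Even with the noise, this at least tells me whether the 40 GB is really retained by the trie or is partly just an inflated heap the JVM hasn't returned to the OS.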
Is there something I am missing here? I am closing the file and the buffered reader as well. What could be hogging all the memory? I can't share the code due to corporate policy; any help or pointers would be highly appreciated. Thanks.