I am working on a project in which I need to load thousands of objects' data into a HashMap/Hashtable/ArrayList. There is no issue with a small data set, but the application runs out of memory with a large one.
Please suggest how to handle this situation.
I am wondering about your requirement here: why is it important to load thousands of objects at the same time? Can you provide more details? Perhaps your implementation can be reworked so that you don't need that many objects in memory at once.
Don't read all the data into memory at once, or expand the memory available to the JVM before execution (e.g. with the `-Xmx` flag).
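For instance, if the objects come from a file or database, you can often process each record as it arrives instead of collecting them all first. A minimal sketch of the streaming approach (the file name `records.txt` and the per-record work are hypothetical):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class StreamingExample {
    public static void main(String[] args) throws IOException {
        // Process each record as it is read, instead of
        // accumulating thousands of objects in a collection.
        try (BufferedReader reader = new BufferedReader(new FileReader("records.txt"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                process(line); // only one record is live at a time
            }
        }
    }

    private static void process(String record) {
        // hypothetical per-record work
        System.out.println(record.length());
    }
}
```

If you go the other route and simply give the application more memory, that is done at launch, e.g. `java -Xmx2g MyApp`.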
You cannot increase the heap size programmatically.
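The maximum heap is fixed when the JVM starts (via `-Xmx`); from inside the program you can only inspect it. A small sketch using the standard `Runtime` API:

```java
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reports the -Xmx limit; it cannot be raised at runtime.
        System.out.println("Max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("Total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("Free heap:  " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}
```

Run it with, say, `java -Xmx2g HeapInfo` and again with a different `-Xmx` value to confirm the limit is set only at startup.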
Either increase the memory, or check through the code for points where the application creates many objects (probably in loops). If you find such a point, null out references to objects you no longer need so the garbage collector can reclaim them (make sure this does not affect your application flow). Another option is to use a lighter representation: for example, a bean can often be replaced by a compact String encoding of its fields, which can also improve the performance of your application (see the sketch below).
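As a rough illustration of the "lighter object" idea (the record fields here are hypothetical), you can keep one encoded String per record instead of a full bean, decoding a field only when it is needed:

```java
import java.util.ArrayList;
import java.util.List;

public class LighterObjects {
    // A full bean carries an object header plus a reference per field;
    // one encoded String per record is often smaller for many-field beans.
    static String encode(String name, int age) {
        return name + "|" + age;
    }

    static int decodeAge(String record) {
        return Integer.parseInt(record.substring(record.indexOf('|') + 1));
    }

    public static void main(String[] args) {
        List<String> records = new ArrayList<>();
        records.add(encode("Alice", 30));
        records.add(encode("Bob", 25));
        System.out.println(decodeAge(records.get(0))); // prints 30
    }
}
```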
You could use a caching system, like Ehcache for example. That would give you some control over the memory used. There are other cache implementations; Ehcache might not suit your needs.
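If pulling in a library is overkill, the core idea (cap how many entries stay in memory and evict the least recently used) can be sketched with the JDK's own `LinkedHashMap`. This is just an illustration of the eviction concept, not Ehcache's API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        // accessOrder = true keeps iteration order least-recently-used first
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once the cap is exceeded,
        // keeping memory use bounded.
        return size() > maxEntries;
    }

    public static void main(String[] args) {
        LruCache<Integer, String> cache = new LruCache<>(2);
        cache.put(1, "a");
        cache.put(2, "b");
        cache.get(1);        // touch 1 so it becomes most recently used
        cache.put(3, "c");   // evicts key 2
        System.out.println(cache.keySet()); // prints [1, 3]
    }
}
```

A real cache like Ehcache adds features this sketch lacks, such as overflow to disk and expiry policies, which is what makes it useful when the data set won't fit in the heap at all.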