
I have a 10 GB heap dump file in HPROF format from a production system. I tried analyzing it with MAT, JVisualVM, and JProfiler.

I need to analyze it for memory issues and bottlenecks.

Loading the file hangs the application partway through, and then I have to restart the system. I even tried it on a server with the application memory set as high as 15 GB, but it did not work.

Could you please help me find a way to analyze it?

  • Does this answer your question? [Tool for analyzing large Java heap dumps](https://stackoverflow.com/questions/7254017/tool-for-analyzing-large-java-heap-dumps) – rogerdpack Sep 15 '21 at 18:01

2 Answers


Analyzing the dump on a server with 15 GB should work, but you need to set the Java heap size accordingly. For example, if you are using the Memory Analyzer plugin in Eclipse (I usually do), you need to edit eclipse.ini and change -Xmx1024m to -Xmx14g. Your file may not contain 1024m, and 14g is just a suggestion, but the maximum heap size needs to be larger than the dump.
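For reference, the tail of eclipse.ini (or MemoryAnalyzer.ini for the standalone MAT) might look roughly like the sketch below; the lines already in your file will differ, and only the -Xmx value needs changing. JVM options must come after the -vmargs marker:

```ini
; ... launcher options above ...
-vmargs
; maximum heap for Eclipse/MAT itself; must exceed the 10 GB dump size
-Xmx14g
```

Note that -Xmx here sizes the heap of the analyzer process, not of the application that produced the dump.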

ewramner

A better option is to use MAT in headless mode. You can analyze the heap dump on a remote Unix server that has enough RAM (the RAM should be larger than the heap dump .hprof file).

I was able to analyze a heap dump of ~20 GB this way. Check the link: https://stackoverflow.com/a/76298700/5140851
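As a sketch of the headless approach: MAT ships with a ParseHeapDump.sh script that parses a dump and generates HTML reports without a GUI. The dump path below is hypothetical, and MAT's own heap should be raised first in MemoryAnalyzer.ini (after -vmargs) so it exceeds the dump size:

```shell
# Run from the unpacked MAT directory on the remote server.
# Generates the standard leak-suspects, overview, and top-components reports
# as zipped HTML next to the dump file.
./ParseHeapDump.sh /data/dumps/prod.hprof \
    org.eclipse.mat.api:suspects \
    org.eclipse.mat.api:overview \
    org.eclipse.mat.api:top_components
```

Once the index files are generated, the resulting reports can be copied to a workstation and opened in a browser, or the parsed indexes can be reopened in the MAT GUI without re-parsing the full dump.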

Vishy Anand