
I am trying to analyze a 200GB heap dump on a remote machine. The heap dump was created through VisualVM's "Create Heap Dump" button. Even running MAT with -Xmx300g, i.e.

    "$(dirname -- "$0")"/MemoryAnalyzer -consolelog -application org.eclipse.mat.api.parse "$@" -vmargs -Xmx300g -XX:-UseGCOverheadLimit

it crashes with the following error:

eclipse.buildId=unknown
java.version=11.0.3
java.vendor=Oracle Corporation
BootLoader constants: OS=linux, ARCH=x86_64, WS=gtk, NL=en_US
Framework arguments:  -application org.eclipse.mat.api.parse ./tmp/heapdump-1577977940574.hprof
Command-line arguments:  -os linux -ws gtk -arch x86_64 -consolelog -application org.eclipse.mat.api.parse ./tmp/heapdump-1577977940574.hprof

!ENTRY org.eclipse.osgi 4 0 2020-01-06 16:25:11.593
!MESSAGE Application error
!STACK 1
java.lang.OutOfMemoryError: Requested length of new long[2,147,483,640] exceeds limit of 2,147,483,639
        at org.eclipse.mat.parser.index.IndexWriter$Identifier.add(IndexWriter.java:91)
        at org.eclipse.mat.hprof.HprofParserHandlerImpl.reportInstance(HprofParserHandlerImpl.java:588)
        at org.eclipse.mat.hprof.Pass1Parser.readPrimitiveArrayDump(Pass1Parser.java:590)
        at org.eclipse.mat.hprof.Pass1Parser.readDumpSegments(Pass1Parser.java:366)
        at org.eclipse.mat.hprof.Pass1Parser.read(Pass1Parser.java:175)
        at org.eclipse.mat.hprof.HprofIndexBuilder.fill(HprofIndexBuilder.java:80)
        at org.eclipse.mat.parser.internal.SnapshotFactoryImpl.parse(SnapshotFactoryImpl.java:222)
        at org.eclipse.mat.parser.internal.SnapshotFactoryImpl.openSnapshot(SnapshotFactoryImpl.java:126)
        at org.eclipse.mat.snapshot.SnapshotFactory.openSnapshot(SnapshotFactory.java:145)
        at org.eclipse.mat.internal.apps.ParseSnapshotApp.parse(ParseSnapshotApp.java:134)
        at org.eclipse.mat.internal.apps.ParseSnapshotApp.start(ParseSnapshotApp.java:106)
        at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
        at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:134)
        at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
        at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:388)
        at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:243)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:656)
        at org.eclipse.equinox.launcher.Main.basicRun(Main.java:592)
        at org.eclipse.equinox.launcher.Main.run(Main.java:1498)
        at org.eclipse.equinox.launcher.Main.main(Main.java:1471)

Could the file be corrupted, or is it simply too big to be analyzed?

awildturtok
  • It is too big. You need more RAM, or to increase the memory limit. – f1sh Jan 06 '20 at 16:13
  • @f1sh It is not too big per se; there are too many objects (2^31) in the hprof file. Increasing the heap size for MAT unfortunately does not help here. – jasonk May 06 '21 at 01:52

2 Answers


You have most likely encountered a known limitation of Memory Analyzer as per this comment:

"Memory Analyzer has an architectural limit of 2^31 - 3 objects, a current limit of 2^31 - 8 = 2,147,483,640 objects, but has not been tested with that many objects. The current record is a heap dump file of 48Gbytes containing 948,000,000 objects, which was opened with Memory Analyzer running with a 58Gbyte heap."

See also https://dev.eclipse.org/mhonarc/lists//mat-dev/msg00324.html

No matter how large the heap is, the JVM will not create an array with a length near Integer.MAX_VALUE (HotSpot stops a few elements short of it), and MAT caps its index arrays at Integer.MAX_VALUE - 8 = 2,147,483,639 entries, which is exactly the limit reported in the error above.
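
For reference, the JVM-level ceiling is easy to reproduce outside MAT. A minimal sketch (the exact headroom below Integer.MAX_VALUE is VM-specific):

    public class ArrayLimitDemo {
        public static void main(String[] args) {
            // On HotSpot this fails with "Requested array size exceeds VM limit"
            // even with an enormous -Xmx, because the length itself is out of range.
            long[] ids = new long[Integer.MAX_VALUE];
            System.out.println(ids.length);
        }
    }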

You could try downloading the latest Memory Analyzer version and see whether this limit has been raised.

Karol Dowbecki
  • Thanks for your answer! The version I'm using should be from September last year, so reasonably fresh. Do you know of a tool that can handle heap dumps this big? – awildturtok Jan 06 '20 at 16:51
  • No, sorry. Do note that [heapdump format is not precise](https://shipilev.net/blog/2014/heapdump-is-a-lie/), and with 200GB worth of heap data you might get cumulative values which are off by a large margin. – Karol Dowbecki Jan 06 '20 at 16:54
  • We are mostly interested in the population of the heap dump, not necessarily the specific size/memory usage. Something must have leaked excessively, and this dump is the only available info I have. – awildturtok Jan 06 '20 at 17:13
  • JVisualVM lets you open a heapdump and has a class view that counts instances. – Karol Dowbecki Jan 06 '20 at 17:19

The latest snapshot build of Eclipse Memory Analyzer has a facility to randomly discard a certain percentage of objects, reducing memory consumption so that the remaining objects can be analyzed. See Bug 563960 - "Requested length of new long[2,147,483,640] exceeds limit of 2,147,483,639", and try the nightly snapshot build to test this facility before it is included in the next release of MAT.
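
Conceptually, the discard facility is reproducible random sampling: keep each object record with a fixed probability, then scale the surviving counts back up. A minimal sketch of the idea (illustration only, not MAT's actual code; the class names and the 75% ratio are made up):

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Random;

    public class DiscardSketch {
        public static void main(String[] args) {
            double discardRatio = 0.75;          // drop ~75% of object records
            Random rnd = new Random(42);         // fixed seed => same sample on a re-parse
            Map<String, Long> sampledCounts = new HashMap<>();

            // Stand-in for streaming object records out of an HPROF parsing pass.
            String[] classes = {"byte[]", "java.lang.String", "com.example.Leak"};
            for (int i = 0; i < 1_000_000; i++) {
                String cls = classes[i % classes.length];
                if (rnd.nextDouble() >= discardRatio) {      // keep ~25% of records
                    sampledCounts.merge(cls, 1L, Long::sum);
                }
            }

            // Estimate true instance counts by dividing by the keep probability.
            sampledCounts.forEach((cls, kept) -> System.out.printf(
                    "%-20s ~%,d instances%n", cls, Math.round(kept / (1 - discardRatio))));
        }
    }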

The help inside MAT from a snapshot build explains how to use this new facility. See 'Memory Analyzer Configuration'.

See this answer to a similar problem: Tool for analyzing large Java heap dumps.

Do let us know if it helps.


user13762112
  • That is great news, thank you a lot! We have recently been having problems with our large heap, and this helps a lot compared to spinning up much smaller instances and assuming they are representative. – awildturtok Oct 18 '20 at 18:00
  • Thank you, I was able to use this locally to capture a larger dump file. – jasonk May 06 '21 at 01:52