
I always thought the maximum array size in Java is 2³¹-1; however, I fail to create a `byte[]` larger than 2³¹-4 on HotSpot. If I do `new byte[Integer.MAX_VALUE]` on HotSpot, I get:

java.lang.OutOfMemoryError: Requested array size exceeds VM limit
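For reference, a minimal stand-alone reproducer (the class name here is just a placeholder of my own, not taken from the original program):

    public class AllocateMaxByteArray {

      public static void main(String[] args) {
        // on HotSpot this throws
        // java.lang.OutOfMemoryError: Requested array size exceeds VM limit
        // even when the heap is large enough to hold the array
        byte[] data = new byte[Integer.MAX_VALUE];
        System.out.println("allocated " + data.length + " bytes");
      }
    }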

I wrote the following program to find out the maximum array size

  private static void printMaxArraySize() {
    int maxArraySize = findMaxArraySize();
    System.out.println("max array size: " + maxArraySize);
    System.out.println("Integer.MAX_VALUE - max array size: " + (Integer.MAX_VALUE - maxArraySize));
  }

  private static int findMaxArraySize() {
    // walk down from Integer.MAX_VALUE until an allocation succeeds
    for (int i = Integer.MAX_VALUE; i >= 0; i--) {
      try {
        byte[] data = new byte[i];
        // use the array so the allocation cannot be optimized away
        if (System.identityHashCode(data) == 42) {
          System.out.println();
        }
        return i;
      } catch (OutOfMemoryError e) {
        // ignore and try the next smaller size
      }
    }
    throw new AssertionError("not able to create a byte[0]");
  }

Yet even with `-Xmx60g -Xms60g -XX:+UseSerialGC` it prints the following (tested with JDK 8 and JDK 10):

max array size: 2147483644
Integer.MAX_VALUE - max array size: 3

OpenJ9, on the other hand, seems to support a `byte[]` of size 2³¹-1 just fine:

max array size: 2147483647
Integer.MAX_VALUE - max array size: 0

Edit

After formulating the question, SO suggested that the issue is JDK-8059914.

Edit 2

The bug report suggests that `size_t` is now used instead of `int`, so the object size limit should really be 4 GB and `new byte[Integer.MAX_VALUE]` should work.
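As a sanity check of that reasoning, here is a rough size estimate for `byte[Integer.MAX_VALUE]`; the 16-byte array header is an assumption about a typical 64-bit HotSpot layout with compressed oops, not something stated in the bug report:

    public class ByteArraySizeEstimate {

      public static void main(String[] args) {
        // assumed byte[] header: mark word + compressed klass pointer + length + padding
        long assumedHeaderBytes = 16;
        long elements = Integer.MAX_VALUE; // 2^31 - 1 elements, one byte each
        long estimatedObjectSize = assumedHeaderBytes + elements;
        // prints 2147483663, i.e. roughly 2 GiB, well below a 4 GB object size limit
        System.out.println("estimated byte[Integer.MAX_VALUE] size: " + estimatedObjectSize + " bytes");
      }
    }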

Edit 3

If I lower `-Xmx`, I also get `Integer.MAX_VALUE - 2`.

Philippe Marschall
  • On my Java 10, I get `Integer.MAX_VALUE - 2`, which is much better than what I got in the past, the last time I checked the JVM's limit. While I always knew about the existence of a JVM-specific limitation, I never read an explanation about why. Now, reading JDK-8059914, I must say that, cough, it is…not very convincing. If the object size is supposed to fit into a 32 bit `int`, that will fail for any array type except `byte[]` and `boolean[]` anyway, as all other types require more than one byte per element. In contrast, there's never a problem with `byte[Integer.MAX_VALUE]` and a 64 bit `int`… – Holger Jul 03 '18 at 10:49
  • So, it is connected to the compressed oops resp. compressed klass pointer feature, which is available for heaps smaller than 32GB (unless you change the object alignment). Your question [has been asked before](https://stackoverflow.com/q/43592109/2711488) and there is [this related question](https://stackoverflow.com/q/31382531/2711488). The more I read about it, the less sense all of that makes to me. – Holger Jul 04 '18 at 14:21
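Following up on the comments about compressed oops and object alignment, a small HotSpot-only sketch to inspect the relevant VM flags at runtime; which flags matter here is my reading of the comments, and the use of HotSpotDiagnosticMXBean is my own choice, not something from the question or the bug report:

    import com.sun.management.HotSpotDiagnosticMXBean;
    import java.lang.management.ManagementFactory;

    public class CompressedOopsCheck {

      public static void main(String[] args) {
        // HotSpot-specific diagnostic bean; not available on OpenJ9
        HotSpotDiagnosticMXBean bean =
            ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
        // compressed oops are typically only used for heaps below ~32 GB
        System.out.println("UseCompressedOops = "
            + bean.getVMOption("UseCompressedOops").getValue());
        // raising the alignment raises the heap size up to which compressed oops work
        System.out.println("ObjectAlignmentInBytes = "
            + bean.getVMOption("ObjectAlignmentInBytes").getValue());
      }
    }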
