
Using the -Xmx1G flag to provide a heap of one gigabyte, the following works as expected:

public class Biggy {
    public static void main(String[] args) {
        int[] array = new int[150 * 1000 * 1000];
    }
}

The array should represent around 600 MB.

However, the following throws OutOfMemoryError:

public class Biggy {
    public static void main(String[] args) {
        int[] array = new int[200 * 1000 * 1000];
    }
}

Even though the array should only represent around 800 MB, and therefore easily fit in memory.
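For reference, the sizing arithmetic behind both figures (each Java int is 4 bytes; class name is mine):

```java
public class ArraySizes {
    public static void main(String[] args) {
        // Each Java int occupies 4 bytes; the array object header adds only a few bytes.
        long small = 150L * 1000 * 1000 * 4; // 600,000,000 bytes ≈ 600 MB
        long large = 200L * 1000 * 1000 * 4; // 800,000,000 bytes ≈ 800 MB
        System.out.println(small / 1_000_000 + " MB and " + large / 1_000_000 + " MB");
    }
}
```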

Where's the missing memory gone?

dagnelies

3 Answers


In Java you typically have multiple regions (and sub-regions) in the heap. With most collectors you have a young and a tenured region. Large arrays are added to the tenured area straight away; however, based on your maximum memory size, some space is reserved for the young space. If you allocate memory slowly, these regions will resize, but a large block like this can simply fail, as you have seen.

Given that memory is usually relatively cheap (not always the case), I would just increase the maximum to the point where you would want the application to fail if it ever used that much.

BTW: If you have a large structure like this you might consider using direct memory.

IntBuffer array = ByteBuffer.allocateDirect(200*1000*1000*4)
                            .order(ByteOrder.nativeOrder()).asIntBuffer();

int a = array.get(n);
array.put(n, a+1);

It's a bit tedious to write, but it has one big advantage: it uses almost no heap (there is less than 1 KB of overhead).
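Put together, a self-contained version of the snippet above (the class name and index variable are mine):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;

public class DirectBiggy {
    public static void main(String[] args) {
        // 200 million ints * 4 bytes each = ~800 MB of native (off-heap) memory.
        // Direct buffers are zero-initialized on allocation.
        IntBuffer array = ByteBuffer.allocateDirect(200 * 1000 * 1000 * 4)
                                    .order(ByteOrder.nativeOrder())
                                    .asIntBuffer();

        int n = 0;                 // any index in [0, 200_000_000)
        int a = array.get(n);      // read element n (initially 0)
        array.put(n, a + 1);       // write element n (now 1)
    }
}
```

Note that the maximum direct memory also defaults to the maximum heap size, so you may need -XX:MaxDirectMemorySize to raise it.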

Peter Lawrey
    Then how does the JVM garbage collect that memory? – Clark Bao Sep 04 '11 at 10:54
  • "it uses almost no heap" - does this mean it sends the JVM to allocate more from the OS's heap, above and beyond the -Xmx specification? – JustJeff Sep 04 '11 at 11:08
  • @Clark Bao - Pure speculation, but I would suspect that if it's getting a special allocation from the OS, the JVM takes the attitude of "on your own head be it". I.e., if *you* want to allocate it this way, *you* can worry about freeing it. – JustJeff Sep 04 '11 at 11:09
    Typically you don't want to be discarding such a large structure. However, when the IntBuffer is collected it contains a Cleaner+Deallocator objects which does a freeMemory when its cleaned up. You can force it to free the memory without a GC in the Sun/Oracle jvm by calling `((DirectBuffer) buffer).cleaner().clean();` but this uses an internal API. – Peter Lawrey Sep 04 '11 at 11:12
    @JustJeff, The direct memory uses native C memory. It gets more memory from the OS as required. The maximum direct memory is the same as the maximum heap size by default in Java 6. But it is above and beyond the heap and has no impact on GC times. – Peter Lawrey Sep 04 '11 at 11:14
    @JustJeff, For you interest you can allocate large areas of C memory using the Unsafe class which you have to free explicitly. It does allow you to create a block memory of any size the OS supports. This is definitely a use as your own risk feature. – Peter Lawrey Sep 04 '11 at 11:16
  • @Peter Lawrey - agreed, it's hard to see in what sense a 200M element array would ever be 'temporary'. Anyway, thanks for the hints about the Dangerous Toys =) – JustJeff Sep 04 '11 at 11:23
    Very interesting and insightful discussion. :) – Clark Bao Sep 04 '11 at 11:30
  • How do I make this for a multi-array? Like, an int[][][]. – Joehot200 Feb 08 '15 at 22:49
  • @Joehot200 you can either have an array of ByteBuffer or you can have a ByteBuffer which contains two or three dimensions by using multiplication. – Peter Lawrey Feb 08 '15 at 22:52
  • @PeterLawrey "Two or three dimensions by using multiplication", what does that mean? I cannot use an array of ByteBuffer's for my purposes. – Joehot200 Feb 08 '15 at 22:53
  • @Joehot200 the same way multidimensional arrays work in C. Exactly how you best do it depends on your requirements. – Peter Lawrey Feb 08 '15 at 23:08

There is enough memory available, but not as a single contiguous block of memory, as needed for an array.

Can you use a different data structure that uses smaller blocks of memory, or several smaller arrays?

For example, the following code does work with -Xmx1G:

public class Biggy {
    public static void main(String[] args) {
        int[][] array = new int[200][];
        for (int i = 0; i < 200; i++) {
            array[i] = new int[1000 * 1000];
            System.out.println("i=" + i);
        }
    }
}
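The chunking above can be hidden behind a small wrapper so the rest of the code still sees a flat, long-indexed array. A minimal sketch (the class and field names are my own):

```java
// A "big array" built from many small blocks, so no single huge
// contiguous allocation is needed. Sketch only; no bounds checking.
public class ChunkedIntArray {
    private static final int CHUNK = 1000 * 1000; // elements per block
    private final int[][] chunks;

    public ChunkedIntArray(long size) {
        int n = (int) ((size + CHUNK - 1) / CHUNK); // number of blocks, rounded up
        chunks = new int[n][];
        for (int i = 0; i < n; i++) {
            chunks[i] = new int[CHUNK]; // each block is small enough to allocate easily
        }
    }

    public int get(long index) {
        return chunks[(int) (index / CHUNK)][(int) (index % CHUNK)];
    }

    public void set(long index, int value) {
        chunks[(int) (index / CHUNK)][(int) (index % CHUNK)] = value;
    }
}
```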
Simon C

Heap memory is divided between three spaces:

  • Old Generation
  • Survivor Space
  • Eden Space

From the start this object will live in the old generation and will remain there for a while.

By default, the virtual machine grows or shrinks the heap at each collection to try to keep the proportion of free space to live objects within a specific range. This target range is set as a percentage by the parameters -XX:MinHeapFreeRatio= and -XX:MaxHeapFreeRatio=, and the total size is bounded below by -Xms and above by -Xmx.

The default ratio in my JVM is 30/70, so the maximum size of an object in the old generation is limited (with -Xmx1G) to about 700 MB (by the way, I get the same exception when running with default JVM parameters).
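A rough way to see how the heap is carved up on your own JVM is to list the memory pools (a sketch; pool names and sizes vary by collector and JVM version):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class HeapRegions {
    public static void main(String[] args) {
        // Print the maximum size of each memory pool (e.g. Eden Space,
        // Survivor Space, Old/Tenured Gen); -1 means the limit is undefined.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            long max = pool.getUsage().getMax();
            System.out.println(pool.getName() + ": "
                    + (max < 0 ? "undefined" : (max / (1024 * 1024)) + " MB max"));
        }
    }
}
```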

However, you can size the generations using JVM options. For example, running your class with -Xmx1G -XX:NewRatio=10 will make new int[200 * 1000 * 1000]; succeed.

From what I can tell, Java wasn't designed to hold large objects in memory. The typical memory usage pattern of an application is a graph of many relatively small objects, and typically you'll get an OutOfMemoryError only when you run out of space in all of the spaces.

Below are a couple of useful (and interesting to read) articles:

Ergonomics in the 5.0 Java[tm] Virtual Machine

Tuning Garbage Collection with the 5.0 Java[tm] Virtual Machine

Josh Milthorpe
Petro Semeniuk