
For a heuristic's precomputed table I need a byte array with 1504935936 entries. This should take about 1.5 GB of memory.

public class Main {
    public static void main(String[] args) {
        byte[] arr = new byte[1504935936]; // ~1.5 GB lookup table
    }
}

Why do I get an "OutOfMemoryError: Java heap space" exception when I give the program 2 GB of memory with

java -Xmx2048M Main

With

java -Xmx2153M Main

it works. Why does it need that much RAM?

Jakube
  • To allocate an object there must be that much CONTIGUOUS space available in the heap. If the heap is 2G, it's highly unlikely that there is 1.5G of contiguous space. (In addition, various platforms have other limits on the maximum size of an individual object, based on the way their address translation hardware works.) – Hot Licks Jul 03 '14 at 18:20
  • @HotLicks I don't think that's true for all JVMs (the contiguous part - and I know of at least one where it's definitely not true, although it's a pretty niche one). – vanza Jul 03 '14 at 19:06
  • @vanza - There's an exception for darn near everything. But any JVM that allowed non-contiguous allocations of arrays would probably be one designed for tiny systems. There are significant (negative) performance implications to allowing for that. – Hot Licks Jul 03 '14 at 20:42
  • Just out of idle curiosity, what are you doing that requires that specific number of entries? – corsiKa Jul 03 '14 at 22:37

5 Answers


Probably because the Java heap is being used and fragmented by other data in your program.

That byte array needs to be allocated as one contiguous 1.5 GB chunk of memory within the Java heap space. (This isn't required by the Java language spec, but AFAIK it is how all current JVM implementations actually work.) Some of your heap space is being consumed and – probably more importantly – fragmented by the other memory allocations that happen in your program prior to allocating that big byte array. That java -Xmx2153M Main figure may simply be how big you have to make the overall heap for a contiguous 1.5 GB region to still be free by the time you reach the allocation.

If you chop up that big array into 100 smaller arrays, each 1/100th the size, it may fit into a smaller heap because it's not as sensitive to heap fragmentation.
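
A minimal sketch of that chunking idea (the class name and chunk size here are illustrative, not from the original answer):

public class ChunkedByteArray {
    // 16 MB chunks: small enough that contiguous space is easy to find for each one
    private static final int CHUNK_SIZE = 1 << 24;
    private final byte[][] chunks;

    public ChunkedByteArray(long length) {
        int numChunks = (int) ((length + CHUNK_SIZE - 1) / CHUNK_SIZE);
        chunks = new byte[numChunks][];
        long remaining = length;
        for (int i = 0; i < numChunks; i++) {
            chunks[i] = new byte[(int) Math.min(CHUNK_SIZE, remaining)];
            remaining -= CHUNK_SIZE;
        }
    }

    public byte get(long index) {
        return chunks[(int) (index / CHUNK_SIZE)][(int) (index % CHUNK_SIZE)];
    }

    public void set(long index, byte value) {
        chunks[(int) (index / CHUNK_SIZE)][(int) (index % CHUNK_SIZE)] = value;
    }
}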

Andrew Janke
  • Aaaaactually, before you Accept this, take a closer look at @jtahlborn's answer (http://stackoverflow.com/a/24561535/105904), and use the `-XX:+PrintGCDetails` switch described in http://stackoverflow.com/questions/24202523/java-process-memory-check-test. The multiple-pools aspect of the heap may be a more direct cause of your issue and the apparent size discrepancy. – Andrew Janke Jul 03 '14 at 19:44

The other posts here have some good info, but they missed a key point:

Get a good memory profiler (preferably one with a visual display) and attach it to your JVM. What you will see is that a modern JVM does not have one large heap space, but instead has multiple pools (also called generations). Typically, the "old generation" is the largest, but you will also have a few others. Together, all these pools should add up to roughly the heap space you allowed for the JVM.

Thus, your "-Xmx2048M" setting does not result in a heap with a single pool which can support a 1.5 GB array (as noted by others, you need a single contiguous chunk of memory for the array, i.e. a chunk of memory entirely contained in a single pool/generation).
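
A quick way to see those pools without a full profiler is the java.lang.management API; a minimal sketch (the class name is mine, the API calls are standard):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class ShowPools {
    public static void main(String[] args) {
        // Print each memory pool the JVM reports, with its maximum size
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            System.out.printf("%-25s max = %d bytes%n",
                    pool.getName(), pool.getUsage().getMax());
        }
    }
}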

jtahlborn

If the process is executing as a 32-bit process, most OSes reserve only about 2 GB of address space for the process; the other 2 GB of address space is mapped for the kernel (so that when your process makes kernel calls, it doesn't have to perform as many context switches).

Even if your machine has 8 GB of RAM, or 2 GB with 2 GB of swap, each 32-bit process would only be able to allocate and address 2 GB, unless you use PAE or similar.

This causes a few problems. One, you may not have enough raw address space to store the total size of all allocations. Two, you may not have a single contiguous chunk of memory the size of the array you need - Java and several other VM environments use separate heaps to store different types of memory, e.g. a large-object heap separate from the gen 0 or gen 1 objects. Each partition leads to smaller contiguous regions.

In a 64-bit process, the address space restrictions are nearly gone; however, you may still not have enough contiguous, committable, Java-allowed memory to satisfy the request. If you set Java to allow only a total of 2 GB of memory, you may still have problems finding enough contiguous memory within it.

Keep in mind that the process also needs a sizable chunk of memory to store the code pages for your program, plus memory for the Java runtime itself. That alone may be a couple hundred megabytes, depending on the demands of the rest of your program.

It may be instructive to execute your simple program while it allocates only a 1-element byte array, and inspect the memory with Sysinternals' VMMap to get an idea of where your memory overhead comes from, excluding your large allocation.

Then give it a shot with your big allocation and see what you get.
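
A sketch of that baseline experiment (the class name is illustrative); the sleep keeps the process alive long enough to attach VMMap or a profiler:

public class Baseline {
    public static void main(String[] args) throws InterruptedException {
        byte[] arr = new byte[1]; // tiny allocation instead of the 1.5 GB table
        System.out.println("Allocated " + arr.length + " byte; attach VMMap now.");
        Thread.sleep(60_000); // keep the process alive for inspection
    }
}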

antiduh
  • Based on the poster's description, this 2GB limit probably isn't relevant: they said it works with `java -Xmx2153M Main`, so the JVM process is able to allocate that much contiguous memory from the system. – Andrew Janke Jul 03 '14 at 18:11
  • @AndrewJanke - A GB limit is relevant if there isn't enough contiguous memory to allocate a 1.5 GB array, regardless of what is causing the limit. One such cause is 32-bit address space; another is the manual Java memory limit options. I provide a response that addresses intermediate and root causes. – antiduh Jul 03 '14 at 19:07

jmap and jhat are good tools for discovering who's using what parts of memory. I recommend starting with a heap dump and looking at that. Only part of the available memory is allocated to the heap in Java; there is also memory needed to run the VM, and stack space. The heap is also divided into parts, and the OutOfMemoryError is thrown when one part (the tenured generation) fills up. The heap analyzer tools will help you determine what exactly is going on.
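
For example, you can capture a dump with jmap and browse it with jhat (the pid 12345 here is illustrative; find the real one with jps):

jmap -dump:format=b,file=heap.hprof 12345
jhat heap.hprof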

For something quicker, you can also try checking these values before allocating your array:

// How much memory the JVM has reserved, and how much of that is currently free
System.out.println("total: " + Runtime.getRuntime().totalMemory());
System.out.println("free:  " + Runtime.getRuntime().freeMemory());

Barett

The JVM memory space is divided into several areas.

Using the option -Xmx you set the size of the Java heap, which for HotSpot is made up of four spaces: Eden, Survivor 1, Survivor 2, and Tenured.

The thing to remember is that the first three make up the young generation, and the rest is called the old generation.

By default the young generation consumes 1/3 of the -Xmx value (the HotSpot default -XX:NewRatio=2 makes the old generation twice the size of the young generation).

That means when you declare -Xmx2g, the young generation will consume more than 600 MB, leaving less than 1.5 GB in the old generation for the array.

With data this large you could consider using a direct ByteBuffer, as described here by Peter:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;

// size = capacity in bytes; the memory is allocated outside the Java heap
IntBuffer arr = ByteBuffer.allocateDirect(size).order(ByteOrder.nativeOrder()).asIntBuffer();
arr.put(n, 1); // arr[n] = 1
arr.get(n);    // arr[n]

Java - Heap vs Direct memory access
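
For the byte-valued table in the question, the IntBuffer view isn't strictly needed; a direct ByteBuffer can be indexed byte-by-byte (a sketch in the same fragment style, with n as an example index):

import java.nio.ByteBuffer;

// Off-heap table; the capacity still has to fit in an int (max 2^31 - 1)
ByteBuffer table = ByteBuffer.allocateDirect(1504935936);
int n = 42;             // example index
table.put(n, (byte) 1); // table[n] = 1
table.get(n);           // table[n]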


To diagnose how the Java heap is consumed by your application on HotSpot (the Oracle VM), you can use a tool delivered with the JDK called jstat. This tool gives you fast feedback about what is happening in your application.

In your case the most interesting options would be gccapacity, which provides data about memory pool generation and space capacities, and gcutil, which prints a summary of garbage collection statistics.
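
For example (the pid 12345 is illustrative; find the real one with jps; the trailing 1000 asks gcutil to print a new sample every 1000 ms):

jstat -gccapacity 12345
jstat -gcutil 12345 1000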

Thanks to gccapacity you will find out the maximum capacity in KB of:

  • NGCMX - maximum new generation capacity
  • S0CMX - maximum survivor space 0 capacity
  • S1CMX - maximum survivor space 1 capacity
  • OGCMX - maximum old generation capacity