
I'm doing some tests on the Java heap with IntelliJ, and I get this error:

java.lang.OutOfMemoryError: GC overhead limit exceeded.

I know this happens when the GC takes too much time, but I cannot understand why the GC runs so frequently and takes so long here.

Design:

In the main thread, I repeatedly create new objects and put them into a map. To prevent these objects from being collected, I print them after all of them have been put into the map.

In the monitor thread, I print the current free heap size every time I create a new object.

Runtime heap size configuration:

-Xms5m -Xmx5m

Test code:

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

public class Example {

    public static ReentrantLock lock = new ReentrantLock(true);
    public static Condition writeCondition = lock.newCondition();
    public static Condition monitorCondition = lock.newCondition();

    public static void main(String[] args) throws Exception {
        new Thread(new Monitor(lock, writeCondition, monitorCondition)).start(); // start a monitor thread
        Map<Integer, Object> map = new HashMap<>();
        for (int count = 0; count < 10000000; count++) {
            lock.lock();
            try {
                // every time a new Object is created, have the monitor thread print the current free heap size
                monitorCondition.signal();
                map.put(count, new Object());
                writeCondition.await();
            } finally {
                lock.unlock();
            }
        }
        for (Map.Entry<Integer, Object> entry : map.entrySet()) {
            // keep references to these objects so they will not be garbage collected
            System.out.println(entry.getKey());
            System.out.println(entry.getValue());
        }
    }
}

class Monitor implements Runnable {

    ReentrantLock lock;
    Condition writeCondition;
    Condition monitorCondition;

    public Monitor(ReentrantLock lock, Condition writeCondition, Condition monitorCondition) {
        this.lock = lock;
        this.writeCondition = writeCondition;
        this.monitorCondition = monitorCondition;
    }

    @Override
    public void run() {
        int count = 0;
        while (true) {
            lock.lock();
            try {
                writeCondition.signal();
                long heapFreeSize = Runtime.getRuntime().freeMemory();
                System.out.println(count + "  times run monitor : ");
                System.out.println("heapFreesize : " + heapFreeSize);
                System.out.println();
                count++;
                monitorCondition.await();
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                lock.unlock();
            }
        }
    }
}

Result:

60005  times run monitor :
heapFreesize : 603896
Exception in thread "Thread-0" java.lang.OutOfMemoryError: GC overhead limit exceeded
    at java.nio.CharBuffer.wrap(CharBuffer.java:373)
    at sun.nio.cs.StreamEncoder.implWrite(StreamEncoder.java:265)
    at sun.nio.cs.StreamEncoder.write(StreamEncoder.java:125)
    at java.io.OutputStreamWriter.write(OutputStreamWriter.java:207)
    at java.io.BufferedWriter.flushBuffer(BufferedWriter.java:129)
    at java.io.PrintStream.newLine(PrintStream.java:545)
    at java.io.PrintStream.println(PrintStream.java:696)
    at hello.Monitor.run(Example.java:58)
    at java.lang.Thread.run(Thread.java:748)
xingbin
  • In this code you are trying to fit 10 million objects (8 bytes each, typically) into a 5 MB heap (without even counting overhead from the runtime and the rest of the elements you use). That is *not* going to work. – gpeche Feb 16 '18 at 09:32
  • @gpeche Yes, in fact I expect an `OutOfMemoryError`. But I cannot figure out why the GC takes so much time, since the objects I create should not be garbage collected. – xingbin Feb 16 '18 at 09:33
  • Interesting read: https://plumbr.io/outofmemoryerror/gc-overhead-limit-exceeded – assylias Feb 16 '18 at 09:35
  • @user6690200 Read https://stackoverflow.com/questions/1393486/error-java-lang-outofmemoryerror-gc-overhead-limit-exceeded, which references http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html – gpeche Feb 16 '18 at 09:36

2 Answers


When available memory gets low, the GC tries to free some memory by collecting unreachable objects in the heap. Checking whether an object is unreachable is not an easy task and takes some time.

When all (or most) of your objects are still referenced somewhere, that time is wasted, since no memory is reclaimed. So when the JVM recognizes that a substantial fraction of the runtime is being spent running the GC instead of doing useful work, it terminates the program with an OutOfMemoryError.
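
If you only want to experiment past this point, HotSpot exposes VM options that control this check (a sketch, assuming the default HotSpot behaviour: the error fires when roughly more than 98% of the time goes into GC while less than about 2% of the heap is recovered). Disabling the check is rarely a good idea, because you then just get a plain OutOfMemoryError later, or a program that barely makes progress.

# defaults shown explicitly; lower GCTimeLimit or raise GCHeapFreeLimit to make the error fire sooner
java -Xms5m -Xmx5m -XX:GCTimeLimit=98 -XX:GCHeapFreeLimit=2 Example
# disable the overhead check entirely (not recommended)
java -Xms5m -Xmx5m -XX:-UseGCOverheadLimit Example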

Henry
  • It seems the JVM is not smart enough to quickly figure out that none of the objects I create can be garbage collected; it just repeatedly checks these objects. Am I right? – xingbin Feb 16 '18 at 09:41
  • Yes, this is correct. However, the problem is inherently difficult, so it's not so much an issue of smartness. – Henry Feb 16 '18 at 09:44
  • @user6690200 what do you mean by “JVM is not smart enough”? You are filling a map with newly created objects until the memory is full. This has nothing to do with smartness. There are no objects to collect. – Holger Feb 16 '18 at 13:01
  • @Holger I mean the JVM spends too much time looking for unreachable objects, while in fact there are no unreachable objects. – xingbin Feb 16 '18 at 13:04
  • @Holger Also, there still seems to be free heap space when the OutOfMemoryError is thrown; I expected this error not to be thrown until there was no free heap space... – xingbin Feb 16 '18 at 13:08

In the JVM arguments you have specified -Xms5m -Xmx5m, so the heap size is fixed at 5 MB, i.e. 5,242,880 bytes. Since you execute the loop 10,000,000 times, you allocate 10,000,000 Integer keys and Object values for the map variable map. Do the arithmetic and you will see that this leaves only about 0.52 bytes on average per map entry (5,242,880 / 10,000,000 = 0.524288), while an Integer object alone needs more than 4 bytes (an int is 4 bytes, and Integer wraps the int in a full object).

So, in short, you have not allocated enough memory to the JVM heap; that is why you are getting java.lang.OutOfMemoryError: GC overhead limit exceeded.

Note: After looking at the following output from your program, I realized that a single entry takes about 848 bytes of memory (see the calculation below). So you would need at least 10,000,000 * 848 = 8,480,000,000 bytes, i.e. roughly 8088 MB of memory (physical or virtual), to hold all the entries:

heapFreesize : 32344

4699761  times run monitor : 
heapFreesize : 31496

4699762  times run monitor : 
heapFreesize : 30648

4699763  times run monitor : 
heapFreesize : 29800

4699764  times run monitor : 
heapFreesize : 28952

4699765  times run monitor : 
heapFreesize : 28104

4699766  times run monitor : 
heapFreesize : 27256

4699767  times run monitor : 
heapFreesize : 26408

4699768  times run monitor : 
heapFreesize : 25560
...

Try allocating more heap to the JVM and you will not get this error. See here and here for more information on Java HotSpot VM options.
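
For example (assuming the Example class from the question is launched directly from the command line, and using the 848-byte-per-entry estimate below), something along these lines should work:

# roughly enough for a 60,000-entry run (see the comments below)
java -Xms60m -Xmx60m Example
# roughly enough for the full 10,000,000-entry loop, if ~848 bytes per entry holds
java -Xms9g -Xmx9g Example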

Calculation: In the output above (posted by me), the available free memory is printed after each insertion. If you take the differences, you will find that the free memory drops by 848 bytes for each entry added to the map, e.g. 32344 - 31496 = 848; 31496 - 30648 = 848; 30648 - 29800 = 848, and so on. Hence I can say that a single entry (not a single object, since there are two objects involved, an Integer key and an Object value) consumes about 848 bytes in this run.
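
For reference, here is a minimal sketch of that measurement as a standalone program (my own illustration, not part of the original code; the class name EntryCostEstimate is made up). Run it with a fixed heap, e.g. -Xms256m -Xmx256m, so freeMemory() is not distorted by the heap growing; the result is only a rough estimate and will vary with JVM version, HashMap resizing and allocation-buffer behaviour.

import java.util.HashMap;
import java.util.Map;

public class EntryCostEstimate {

    // batch size is arbitrary; a larger batch averages out HashMap resizes
    private static final int ENTRIES = 100_000;

    public static void main(String[] args) {
        Map<Integer, Object> map = new HashMap<>();
        Runtime rt = Runtime.getRuntime();

        long before = rt.freeMemory();
        for (int count = 0; count < ENTRIES; count++) {
            map.put(count, new Object());
        }
        long after = rt.freeMemory();

        // average bytes of heap consumed per map entry (key, value and internal entry object)
        System.out.println("approximate bytes per entry: " + (before - after) / ENTRIES);
    }
}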

Also note that the JVM will grow and shrink its heap as needed if you have not set the -Xms and -Xmx switches to the same value (I did not use these switches while executing the sample run above).

Note: Your code also looks logically incorrect, because the writeCondition.signal() and monitorCondition.await() calls are not properly paired: after the main loop finishes, nothing ever signals the monitor again, so the monitor thread awaits forever and the program appears to hang at the end.
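
One possible way to let the monitor thread terminate cleanly (a sketch of my suggestion, not code from the question; the loop count is reduced so the program can actually finish in a small heap): share a finished flag, signal the monitor one last time after the loop, and have the monitor re-check the flag every time it wakes up.

import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

public class ExampleWithShutdown {

    static final ReentrantLock lock = new ReentrantLock(true);
    static final Condition writeCondition = lock.newCondition();
    static final Condition monitorCondition = lock.newCondition();
    static volatile boolean finished = false;

    public static void main(String[] args) {
        Thread monitor = new Thread(() -> {
            lock.lock();
            try {
                while (!finished) {
                    writeCondition.signal();
                    System.out.println("free heap: " + Runtime.getRuntime().freeMemory());
                    monitorCondition.await(); // releases the lock while waiting
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } finally {
                lock.unlock();
            }
        });
        monitor.start();

        Map<Integer, Object> map = new HashMap<>();
        for (int count = 0; count < 10_000; count++) { // reduced count for the demo
            lock.lock();
            try {
                monitorCondition.signal();
                map.put(count, new Object());
                writeCondition.await();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            } finally {
                lock.unlock();
            }
        }

        // tell the monitor we are done; without this final signal it awaits forever
        lock.lock();
        try {
            finished = true;
            monitorCondition.signal();
        } finally {
            lock.unlock();
        }
        System.out.println("entries in map: " + map.size());
    }
}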

cse
  • Thank you. I changed the loop to 60000, which should be small enough to fit into the 5 MB heap, but I still get the same error. Is there any way I can lower the limit on the time the GC is allowed to take? – xingbin Feb 16 '18 at 09:58
  • @user6690200 For `60000` entries in the `map` variable, you need at least `48.6 MB` of memory. So I'd suggest setting the `JVM` parameters to `-Xms60m -Xmx60m`. _Note that 48.6 MB is the memory required by `map` to hold the entries; some more memory will be needed by the `JVM` to perform its other tasks._ – cse Feb 16 '18 at 10:03
  • Thank you very much! I found the test results can differ when using 60000 entries and a 5 MB heap: sometimes all 60000 entries can be put into the map, sometimes it fails. May I ask why you say a single entry needs `848` bytes? – xingbin Feb 16 '18 at 10:17
  • @user6690200 In the provided output (posted by me in the answer), the available memory is printed. Now if you do the calculation, you will find that the difference in free memory after adding one entry to the `map` is `848`. Hence I can say that a _single entry (not a single object, as there are two objects: an `Integer` and an `Object`) in `map` consumes `848 bytes`_. Also note that the `JVM` will increase its heap size if needed when you have not set the `-Xms` and `-Xmx` switches to the same value (I didn't use these switches while executing the sample run). – cse Feb 16 '18 at 10:24