
I've noticed that sometimes, when memory is nearly exhausted, the GC tries to complete at any cost to performance (nearly freezing the program, sometimes for multiple minutes), rather than just throwing an OOME (OutOfMemoryError) immediately.

Is there a way to tune the GC concerning this aspect?

Slowing the program down to nearly zero speed makes it unresponsive. In certain cases it would be better to have a response of "I'm dead" rather than no response at all.

java.is.for.desktop
    An `Error` is a failure in the JVM. Why should the JVM throw an error (OOME) if there isn't any? – Vineet Reynolds Jul 15 '11 at 10:39
  • If you care about performance, you should probably ensure the JVM has enough memory to do its job *before* trying to fine-tune the GC. Either add more cheap RAM, or track down the memory leak that is consuming it all.... – mikera Jul 15 '11 at 11:41
  • You would have to be careful in the code that `catch`es the OOME. Because it is a `VirtualMachineError` that can be thrown at *any* time (not just as the result of a `new`), when you catch it your objects may be in an inconsistent state. See http://stackoverflow.com/questions/8728866/no-throw-virtualmachineerror-guarantees – Raedwald Jan 09 '12 at 13:15

4 Answers


Something like what you're after is built into recent JVMs.

If you:

  • are using Hotspot VM from (at least) Java 6
  • are using the Parallel or Concurrent garbage collectors
  • have the option UseGCOverheadLimit enabled (it's on by default with those collectors, so more specifically if you haven't disabled it)

then you will get an OOM before actually running out of memory: if more than 98% of recent time has been spent in GC for recovery of <2% of the heap size, you'll get a preemptive OOM.

Tuning these parameters (the 98% in particular) sounds like it would be useful to you; however, as far as I'm aware there is no way to tune those thresholds.

However, check whether you qualify under the three points above; if you're not currently using those collectors with that flag, switching to them may help your situation.

It's worth reading the HotSpot JVM tuning guide, which can be a big help with this stuff.
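To see the effect, here is a minimal sketch (the class name, heap size, and launch line below are an illustrative setup, not part of the original answer) that keeps allocating small objects until the collector gives up:

```java
// Illustrative launch line (sizes are arbitrary; the flags are the ones
// discussed above, and UseGCOverheadLimit is on by default here):
//   java -Xmx64m -XX:+UseParallelGC -XX:+UseGCOverheadLimit GcOverheadDemo
import java.util.ArrayList;
import java.util.List;

public class GcOverheadDemo {
    public static void main(String[] args) {
        List<byte[]> retained = new ArrayList<>();
        try {
            while (true) {
                retained.add(new byte[1024]); // keep every allocation reachable
            }
        } catch (OutOfMemoryError e) {
            // With the overhead limit in effect this is typically
            // "GC overhead limit exceeded" rather than a multi-minute freeze.
            System.err.println("Got OOME: " + e.getMessage());
        }
    }
}
```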

Cowan

I am not aware of any way to configure the Java garbage collector in the manner you describe.

One way might be for your application to proactively monitor the amount of free memory, e.g. using Runtime.freeMemory(), and declare the "I'm dead" condition if that drops below a certain threshold and can't be rectified with a forced garbage collection cycle.

The idea is to pick a threshold large enough that the process never gets into the situation you describe.
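A minimal sketch of that idea (the class name, threshold, and polling interval are invented for illustration): a background thread samples the free heap, forces a collection when it dips below the threshold, and invokes an "I'm dead" callback if that doesn't help.

```java
// Rough sketch only; the 5% threshold and 1-second interval are arbitrary
// and would need tuning per application.
public class MemoryWatchdog implements Runnable {
    private static final double MIN_FREE_RATIO = 0.05;
    private static final long CHECK_INTERVAL_MS = 1000;

    private final Runnable imDeadHandler;

    public MemoryWatchdog(Runnable imDeadHandler) {
        this.imDeadHandler = imDeadHandler;
    }

    private static double freeRatio() {
        Runtime rt = Runtime.getRuntime();
        // Free heap = unused part of the committed heap + heap not yet committed.
        long free = rt.freeMemory() + (rt.maxMemory() - rt.totalMemory());
        return (double) free / rt.maxMemory();
    }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            if (freeRatio() < MIN_FREE_RATIO) {
                System.gc(); // last-ditch attempt to reclaim space
                if (freeRatio() < MIN_FREE_RATIO) {
                    imDeadHandler.run(); // declare the "I'm dead" condition
                    return;
                }
            }
            try {
                Thread.sleep(CHECK_INTERVAL_MS);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}
```

It could be started on a daemon thread at application startup; note, as one of the comments below points out, that polling like this has a cost of its own.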

NPE
  • -1 as the previous answer explains exactly how to fine-tune the GC – Angel O'Sphere Jul 15 '11 at 12:19
  • @Angel O'Sphere: Thanks for taking the time to explain the reason for the downvote. However, please note that the "previous" answer has been posted *after* mine. – NPE Jul 15 '11 at 12:20
  • OK, I'll add a +1 again for the sake of fairness ;D However, if you are not aware of the options modern JVMs have, you should perhaps refrain from posting. Active monitoring, for example, needs to be done so often in a thread that it might become a cause of slowdown in itself... (oops, can't undo the -1) – Angel O'Sphere Jul 15 '11 at 12:38
  • @Angel O'Sphere: No worries. For everyone's benefit, I think it's best if we all focus on making a constructive contribution instead of going about commenting on everybody else's efforts. – NPE Jul 15 '11 at 12:54

I strongly advise against this; Java trying to GC rather than immediately throwing an OutOfMemoryError makes far more sense - don't make your application fall over unless every alternative has been exhausted.

If your application is running out of memory, you should be increasing your max heap size or looking at its performance in terms of memory allocation and seeing if it can be optimised.

Some things to look at would be:

  • Use weak references in places where your objects would not be required if they weren't referenced anywhere else.
  • Don't allocate larger objects than you need (e.g. storing a huge array of 100 objects when you will only ever access three of them over the array's lifecycle, or using a long data type when you only need to store eight values).
  • Don't hold onto references to objects for longer than you need!

Edit: I think you misunderstand my point. If you accidentally leave a live reference to an object that no longer needs to be used, it obviously still won't be garbage collected. This has nothing to do with nulling references "just in case". A typical example would be using a large object for a specific purpose; when it goes out of scope it is not GCed because a live reference has accidentally been left somewhere else, somewhere you don't know about, causing a leak. A classic case of this is a hashtable used for lookups, which can be solved with weak references, as an entry becomes eligible for GC when it is only weakly reachable (see the sketch below).
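As a hedged illustration of that hashtable case (class and variable names are invented for the example), a `WeakHashMap` holds its keys only weakly, so an entry can be collected once nothing else references its key:

```java
import java.util.Map;
import java.util.WeakHashMap;

public class WeakLookupDemo {
    public static void main(String[] args) throws InterruptedException {
        Map<Object, String> lookup = new WeakHashMap<>();
        Object key = new Object();
        lookup.put(key, "some expensive-to-compute value");
        System.out.println("entries before: " + lookup.size()); // 1

        key = null;   // drop the last strong reference to the key
        System.gc();  // only a hint, but usually sufficient for a demo
        Thread.sleep(100);
        System.out.println("entries after: " + lookup.size());  // typically 0
    }
}
```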

Regardless, these are just general ideas off the top of my head on how to improve performance with memory allocation. The point I am trying to make is that asking how to throw an OutOfMemoryError sooner, rather than letting the Java GC try its best to free up space on the heap, is not a great idea IMO. Optimize your application instead.

mogronalol
  • -1 ... you seem not to know how a GC works; suggesting WeakReferences won't help in any way, it will only increase the memory footprint. The same is true for your "don't hold onto references" statement ... there is no point in nulling something out "just in case"; in fact it is a common misunderstanding of GC to do so. – Angel O'Sphere Jul 15 '11 at 12:22
  • Ah, thanks for downmodding ;D Are you insulted that I modded you down and explained why? Ah well, my other comment on the other answer got modded down as well, so perhaps it was someone else. – Angel O'Sphere Jul 15 '11 at 14:37
  • I don't have the ability to downmod people! Please see my edit :) – mogronalol Jul 15 '11 at 15:46
  • Your clarification makes sense, of course ;D However, I had assumed the asker had already done this and is now stuck. If he is still on the hunt for a memory leak, I would suggest profiling the application. – Angel O'Sphere Jul 20 '11 at 15:10

Well, it turns out there has been a solution since Java 8 build 92:

-XX:+ExitOnOutOfMemoryError
When you enable this option, the JVM exits on the first occurrence of an out-of-memory error. It can be used if you prefer restarting an instance of the JVM rather than handling out of memory errors.

-XX:+CrashOnOutOfMemoryError
If this option is enabled, when an out-of-memory error occurs, the JVM crashes and produces text and binary crash files (if core files are enabled).

A good idea is to combine one of the above options with the good old -XX:+HeapDumpOnOutOfMemoryError.
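For example (the launch line, class name, and heap size below are placeholders, not from the original answer), a small program that exhausts the heap should make the JVM exit immediately and leave a heap dump behind:

```java
// Hypothetical launch line combining the flags discussed above:
//   java -Xmx64m -XX:+ExitOnOutOfMemoryError -XX:+HeapDumpOnOutOfMemoryError ExitOnOomDemo
// The JVM should terminate on the first OutOfMemoryError and write a heap dump
// instead of limping along.
import java.util.ArrayList;
import java.util.List;

public class ExitOnOomDemo {
    public static void main(String[] args) {
        List<long[]> blocks = new ArrayList<>();
        while (true) {
            blocks.add(new long[1 << 20]); // ~8 MB per block, all kept reachable
        }
    }
}
```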

I tested these options; they actually work as expected!

Links

See the feature description

See the list of changes in that Java release

java.is.for.desktop