We have a Java application that reads a chunk of data but keeps that data only for a short period of time. The data is stored in "simple" collections (HashMap, HashSet). These collections are cleared once the data has been processed (so I call coll.clear() and not coll = null). The cycle (read-process-clear) continues until "all chunks of data" are processed. After a certain amount of time, there will be "new chunks" and the whole thing starts again.
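To illustrate, the cycle looks roughly like this (a sketch only: hasMoreChunks, readChunk and process are placeholder names, not our real API; the point is that the collections are reused and cleared, never re-assigned):

final Map<String, Object> dataByKey = new HashMap<String, Object>();
final Set<Object> seenRecords = new HashSet<Object>();
while (hasMoreChunks()) {
    readChunk(dataByKey, seenRecords);   // fill the collections with the next chunk
    process(dataByKey, seenRecords);     // work on the data
    dataByKey.clear();                   // release the references...
    seenRecords.clear();                 // ...but keep the collection instances
}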
This process had been running for several weeks on a server without any problem.
However, today, after a planned restart, it crashed over and over again with an OutOfMemoryError: Java heap space (and was restarted automatically by a monitoring process each time).
I connected to the process with a remote debugger AND with the jvisualvm tool to find out whether (and where) I might have a memory leak. While the processing thread was paused (by the debugger) right after the calls to clear(), I forced a GC with the jvisualvm tool. As I had expected, it cleared almost the entire heap (only 4 MB used). The next cycles showed the same behaviour: almost no heap usage after clear(), and so on. In the end, the process did not run out of memory anymore!
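In case it helps, this is roughly how I could double-check what VisualVM shows, by logging the approximate heap usage right after the clear() calls (only an estimate, since the GC may not have run at that point):

final Runtime rt = Runtime.getRuntime();
final long usedBytes = rt.totalMemory() - rt.freeMemory();
System.out.println("Heap in use after clear(): " + (usedBytes / (1024L * 1024L)) + " MB");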
To me, it looks like the garbage collector failed to work correctly...
- How can I verify whether that is the case?
- If so, how can this be?
- Should I call System.gc() after the clear() calls? But as far as I know (and have read here), that would only be a "suggestion" to the VM; and the GC will collect all possible garbage anyway when the heap is almost full; and such a call should simply be avoided :-)...
(We're running Java 1.6.0_51-b11 in server mode on Solaris, with no special GC options.)
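To actually verify what the collector does across cycles, I am thinking of restarting the process with the standard HotSpot GC-logging flags (the application name and log path below are placeholders), so the log shows whether collections run and how much they reclaim:

java -server -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -Xloggc:/path/to/gc.log MyApplication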
EDIT after analyzing heap dumps:
Our code has this structure:
final DataCollector collector = ...
while (!collector.isDone()) {
    final List<Data> dataList = collector.collectNext();
    for (final Data data : dataList) {
        // process data...
    }
}
The OutOfMemoryError occurs while executing the collector.collectNext() method.
It looks like the heap still contains the List<Data> referenced by the dataList variable (and all of its Data objects) from the previous iteration of the while loop!
Is it normal behaviour that a local variable of a while loop does not get garbage-collected? If that's true, we would have to give this process almost twice as much memory as it strictly needs...
As a hack/check, I added a line dataList = null after the for loop, but this does not change the behaviour (still an OOM, and the heap dump still shows the same 'double assignment').
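The next thing I plan to try (just a sketch for now, reusing the DataCollector/Data names from the snippet above) is to move the per-chunk work into its own method, so that no local variable in the while loop's frame still references the previous list while collectNext() allocates the next one:

while (!collector.isDone()) {
    processChunk(collector.collectNext());
}

private void processChunk(final List<Data> dataList) {
    for (final Data data : dataList) {
        // process data...
    }
    // once this method returns, its frame (and with it the dataList reference) is gone
}

If the retained list really is just the stale value sitting in the loop's stack slot, this should let it become unreachable before the next collectNext() call.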
(I guess we were lucky that the process did not crash earlier.)