
I have a database query which returns around 200,000 records, and the following code:

List<Object> list = query.list();

When execution hits this line, I see `java.lang.OutOfMemoryError: GC overhead limit exceeded` in the browser's debug console (F12).

What can I do to overcome this issue?

rimonmostafiz
T Anna
  • https://stackoverflow.com/questions/37335/how-to-deal-with-java-lang-outofmemoryerror-java-heap-space-error-64mb-heap – Pankaj Gadge Jan 18 '18 at 06:57
  • Don't load so many records. There's rarely a valid reason to have hundreds of thousands of records in the memory at the same time. – Kayaman Jan 18 '18 at 07:01
  • Use JVM options to increase the memory size, or change your query. Why do you need so many records at once? Either use aggregate functions on the server, or read a certain number of records at a time, do your thing, then move on to the next batch, iteratively. – hazimdikenli Jan 18 '18 at 07:02
  • Can't you just read the query results bit by bit? – Sweeper Jan 18 '18 at 07:07
  • What about using an optimum `fetchSize` while querying database? – Himanshu Bhardwaj Jan 18 '18 at 07:09
  • `200000` is not a big number. Of course, it depends on the actual size of the objects, but let’s assume 100 bytes each, then, we’re talking about ~20MB. A quarter century ago, Java’s default heap limit was 64MB. Today, we have heaps of several GB… – Holger Jan 18 '18 at 10:33
  • @PankajGadge that was ten years ago. That old limit should not have any relevance today. – Holger Jan 18 '18 at 10:37
  • Yes, that's right. This is a production issue which has to be fixed immediately, so I wanted a remedy. Of course, as a prevention, we will be rewriting the code to fetch records in chunks. – T Anna Jan 18 '18 at 10:58

1 Answer


The following are some possible "band-aid" solutions. The real solution is to do a proper investigation of the cause of your excessive memory usage, and develop a considered fix for it.

Band-aid solution #1: Increase the heap size. Keep increasing the heap size until the OOMEs stop.
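You raise the limit with the standard `-Xmx` JVM option (e.g. `java -Xmx4g ...`). A minimal sketch to verify what limit the running JVM actually got (the `HeapCheck` class name is illustrative):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use; raised via e.g. -Xmx4g
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Running this before and after changing `-Xmx` confirms the new limit actually took effect in your container or app server.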

Band-aid solution #2: If you call `query.setMaxResults(count)` before calling `list()`, you will get at most `count` elements in the list. If `count` is small enough, you should get no OOMEs.
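Combined with `query.setFirstResult(offset)`, this gives you page-by-page fetching. A minimal self-contained sketch of the loop, where `fetchPage` is a stand-in for `query.setFirstResult(offset).setMaxResults(pageSize).list()` and an in-memory list stands in for the database:

```java
import java.util.ArrayList;
import java.util.List;

public class PagedFetch {
    // Stand-in for the database table; in the real code the rows live in the DB
    static final List<Integer> TABLE = new ArrayList<>();
    static {
        for (int i = 0; i < 200_000; i++) TABLE.add(i);
    }

    // Stand-in for query.setFirstResult(offset).setMaxResults(pageSize).list()
    static List<Integer> fetchPage(int offset, int pageSize) {
        int end = Math.min(offset + pageSize, TABLE.size());
        if (offset >= end) return new ArrayList<>();
        return new ArrayList<>(TABLE.subList(offset, end));
    }

    public static void main(String[] args) {
        int pageSize = 10_000;
        long rowsSeen = 0;
        for (int offset = 0; ; offset += pageSize) {
            List<Integer> page = fetchPage(offset, pageSize);
            if (page.isEmpty()) break;      // no more rows
            rowsSeen += page.size();        // process the page, then let it be GC'd
        }
        System.out.println("Rows processed: " + rowsSeen);
    }
}
```

Only one page (here 10,000 rows) is ever held in memory at a time, so the peak footprint no longer grows with the total result size.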

Band-aid solution #3: Don't retrieve the values into a list. Instead, do something like this:

Iterator<Object> iter = query.iterate();
while (iter.hasNext()) {
    Object row = iter.next();
    // process the row here, without accumulating it in a list
}

Note that the OOMEs could actually be a symptom of a (different) memory leak. None of the above band-aid solutions will help with that in the long term.

Stephen C
    Solution #4: use a profiling tool or heap analyzer to determine what actually causes the big memory consumption. – Holger Jan 18 '18 at 14:03
  • Agreed. Though the OP seems to want a quick band-aid. – Stephen C Jan 19 '18 at 02:08
  • Thanks much for the options. – T Anna Jan 19 '18 at 04:29
  • I have a few doubts: 1) Regarding solution 2, if I set setMaxResults(100) and then call query.list(), how will I fetch the remaining objects? I have to prepare a list from the result set and send it over to another function. 2) Regarding solution 3, again, I have to send a list to another piece of code; I can't have the iterator fetching each object one by one. – T Anna Jan 19 '18 at 10:06
  • 1) You run the query again. 2) You change the other piece of code to accept an `Iterator` instead of a `List`. Alternatively implement a *lazy* list by wrapping the `Iterator`. (That will only work if the other code only makes one pass through the list.) – Stephen C Jan 19 '18 at 12:34
  • But if your algorithms only work efficiently if you have the entire resultset in memory (as a list, for example) then you need a big enough heap ... or alternative algorithms. – Stephen C Jan 19 '18 at 12:36
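The change suggested in the comments above, having the other code accept an `Iterator` instead of a `List`, can be sketched as follows; `processRows` and the `String` row type are illustrative, not from the original code:

```java
import java.util.Iterator;
import java.util.List;

public class StreamingConsumer {
    // Downstream code rewritten to take an Iterator instead of a List:
    // it sees one row at a time, so the full result set is never in memory at once.
    static long processRows(Iterator<String> rows) {
        long count = 0;
        while (rows.hasNext()) {
            String row = rows.next();
            count++;               // do the real per-row work here
        }
        return count;
    }

    public static void main(String[] args) {
        // In the real code the iterator would come from query.iterate()
        Iterator<String> rows = List.of("a", "b", "c").iterator();
        System.out.println("Rows processed: " + processRows(rows));
    }
}
```

As Stephen C notes, this only works if the consumer makes a single pass; code that needs random access or multiple passes still needs either the whole list in memory or a different algorithm.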