
I have encountered the following problem:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded

I tried to fix this error with the solution suggested here:

Run -> Run Configurations -> click on Arguments -> inside VM arguments type

-Xms1024M -Xmx2048M

Xms - minimum heap size

Xmx - maximum heap size

But it still shows the same error. What should I do?

Currently I am parsing a 76 GB JSON file. Please keep this in mind.

Arayan Singh
  • `-Xms1024M` and `-Xmx2048M` mean 1 GB and 2 GB, and you are trying to load a 76 GB file. Even with good GC behaviour, it is not possible IMO – Gilles-Antoine Nys May 17 '18 at 09:43
  • Can you provide some suggestions on how to solve this problem? – Arayan Singh May 17 '18 at 09:46
  • See the following post: https://stackoverflow.com/questions/1393486/error-java-lang-outofmemoryerror-gc-overhead-limit-exceeded – tom May 17 '18 at 09:48
  • Can the source file somehow be split? Can the parsing lib parse in chunks rather than loading the entire file? Can you use a (much) larger amount of memory for -Xmx? – David May 17 '18 at 09:49

4 Answers


I agree with @Gilles-Antoine Nys. With a 2 GB heap, trying to load the entire 76 GB JSON file into memory puts a huge amount of pressure on the garbage collector.

Keep in mind that if the source file is 76 GB, the in-memory representation on the Java heap will likely be much larger, due to the overhead of the objects created to represent each node and the metadata kept by your parsing library.

If you're not loading the entire file at once but parsing it incrementally, you might have more luck, but I presume this will depend a lot on how your underlying JSON parsing library works.
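
For illustration, a minimal sketch of what incremental parsing with Jackson's streaming API could look like (the file name huge.json and the field name "name" are placeholders, not from the question):

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

import java.io.File;
import java.io.IOException;

public class StreamingParseSketch {
    public static void main(String[] args) throws IOException {
        JsonFactory factory = new JsonFactory();
        // Read the file token by token instead of building the whole tree in memory.
        try (JsonParser parser = factory.createParser(new File("huge.json"))) {
            while (parser.nextToken() != null) {
                if (parser.getCurrentToken() == JsonToken.FIELD_NAME
                        && "name".equals(parser.getCurrentName())) {
                    parser.nextToken();                   // advance to the field's value
                    System.out.println(parser.getText()); // handle it, then move on
                }
            }
        }
    }
}
```

Only the current token is held in memory at any point, so the heap footprint stays roughly constant regardless of the file size.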

David

We'd have to see the code that you wrote to parse the file, but this kind of error is usually caused by trying to read the entire file at once instead of using a streaming approach.

Both Jackson and Gson (if you're using a JSON library, of course) provide streaming functionality.
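
As a rough sketch of the Gson side, assuming the top-level JSON is one large array of objects (the file name and the field name "id" below are placeholders):

```java
import com.google.gson.stream.JsonReader;

import java.io.FileReader;
import java.io.IOException;

public class GsonStreamingSketch {
    public static void main(String[] args) throws IOException {
        try (JsonReader reader = new JsonReader(new FileReader("huge.json"))) {
            reader.beginArray();                      // top-level [ ... ]
            while (reader.hasNext()) {
                reader.beginObject();                 // one element at a time
                while (reader.hasNext()) {
                    if ("id".equals(reader.nextName())) {
                        System.out.println(reader.nextLong()); // use the value, then discard it
                    } else {
                        reader.skipValue();           // don't buffer fields you don't need
                    }
                }
                reader.endObject();
            }
            reader.endArray();
        }
    }
}
```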

FrederikVH

You will never be able to process 76 GB of data in Java if you do not stream the file while processing it.

You can stream your file with different solutions:
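
One possible sketch, using Jackson's MappingIterator to bind one element of a large top-level array at a time (the Record type, its fields, and the file name are placeholders, not from the question):

```java
import com.fasterxml.jackson.databind.MappingIterator;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.io.IOException;

public class MappingIteratorSketch {
    // Placeholder type for one element of the big JSON array.
    public static class Record {
        public String name;
        public long value;
    }

    public static void main(String[] args) throws IOException {
        ObjectMapper mapper = new ObjectMapper();
        // readValues() binds one array element at a time instead of the whole file.
        try (MappingIterator<Record> it =
                     mapper.readerFor(Record.class).readValues(new File("huge.json"))) {
            while (it.hasNext()) {
                Record r = it.next();
                System.out.println(r.name + " -> " + r.value);
            }
        }
    }
}
```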

Ramzus

When I was handling Wikipedia dump data (which was more than 50 GB), I changed the VM arguments like this:

Run -> Run Configurations -> click on Arguments -> inside VM arguments type

-Xms1g -Xmx8g 

Xms - minimum heap size

Xmx - maximum heap size

It worked for me.

Note: I have 16 GB of RAM.
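
For reference, roughly the same effect outside Eclipse comes from passing the same flags to the java launcher when starting the program (the jar name below is a placeholder):

java -Xms1g -Xmx8g -jar my-parser.jar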

Arayan Singh