
I'm currently running my system against a rather large dataset and am getting the error `java.lang.OutOfMemoryError: Java heap space`.

Is there any way to get around this, or is it simply that the dataset is too large to be used?

user3469624
    Not enough information. Is it a file you're trying to load into memory? If so, try a `BufferedReader`. – Patrick J Abare II May 07 '14 at 16:41
  • 1
    The question is too vague; you can only hold as much data in the heap as the heap size you allocate. – jmj May 07 '14 at 16:41
  • 3
    You aren't giving us much to go on. But generally speaking, there are multiple approaches for working with large datasets. – pamphlet May 07 '14 at 16:41
  • 2
    You can increase the amount of memory available to the JVM by passing the `-Xmx` (e.g. `-Xmx4G` for max 4GB) argument at startup. – awksp May 07 '14 at 16:42
  • 2
    See the question and answers here: http://stackoverflow.com/questions/3030263/increasing-the-jvm-maximum-heap-size-for-memory-intensive-applications – Tassos Bassoukos May 07 '14 at 16:42
  • How do we even know that your dataset is really the problem? For all we know, it could be a 10 MB file and you are just creating too many objects in an infinite loop. – Erran Morad May 07 '14 at 17:32
  • Try this code to get the same error - `public class OutaMem { public static void main(String [] args){ String str = "Amnesia..."; StringBuilder sb = new StringBuilder(""); System.out.println("I lost my memory..."); while(true){ sb.append(str); } } }` – Erran Morad May 07 '14 at 17:40
  • I know the dataset is the issue because I was running my system with a slightly smaller version of it and it was completely fine; as soon as I plugged this dataset in, I got the memory error. It's a huge matrix with 300 dimensions and around 26,000 words, so I'm not surprised this has happened, to be honest. – user3469624 May 07 '14 at 19:28
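For a rough sense of scale, the matrix described in the last comment can be sized directly (a back-of-the-envelope sketch; the class and method names are illustrative, and it assumes primitive `double` cells while ignoring per-object JVM overhead):

```java
public class MatrixMemoryEstimate {
    // Raw bytes for a dense matrix of primitive cells (no JVM overhead).
    static long rawBytes(long rows, long cols, long bytesPerCell) {
        return rows * cols * bytesPerCell;
    }

    public static void main(String[] args) {
        // 26,000 word vectors × 300 dimensions, 8 bytes per double
        long bytes = rawBytes(26_000, 300, 8);
        System.out.printf("Raw matrix data: ~%d MB%n", bytes / (1024 * 1024));  // ~59 MB
    }
}
```

That is about 59 MB of raw data alone; a boxed representation such as `ArrayList<ArrayList<Double>>` can easily need several times that, which is why a primitive `double[][]` (or a larger `-Xmx` setting) may be the difference between fitting in the heap and not.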

2 Answers


In general, you can either

  • give it more memory, i.e. increase the maximum heap size, but don't give it more than about 90% of main memory. (By the way, the default is 25% of main memory, up to 32 GB.)
  • optimise the code so that it uses less memory, e.g. with the help of a memory profiler. You can use a more efficient data structure, or load portions of the data into memory at a time.
  • break up the data so it only works on a portion at a time.
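The last two points can be sketched as follows (a minimal illustration, not from the answer itself; the class and method names are hypothetical): by reading line by line instead of loading the whole file, only the current portion is ever held in memory.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class ChunkedProcessing {
    // Read and process one line at a time; only the current line is
    // ever held in memory, never the whole file.
    static long countTokens(Reader source) throws IOException {
        long tokens = 0;
        try (BufferedReader in = new BufferedReader(source)) {
            String line;
            while ((line = in.readLine()) != null) {
                tokens += line.trim().split("\\s+").length;  // process, then discard
            }
        }
        return tokens;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for a large file; wrapping a FileReader works the same way.
        System.out.println(countTokens(new StringReader("a b c\nd e\nf")));  // prints 6
    }
}
```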
Peter Lawrey

If it's not the dataset that's eating up memory, it could be that you are not freeing up objects once they are inactive.

This is typically due to keeping references to very large objects, or to lots of objects lying around long after they are no longer needed. These are most likely references held in static variables, but they can also be references to large temporary objects (e.g., large `StringBuilder` objects) within methods that are still active.
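A minimal illustration of the static-reference case (hypothetical names; the fix is simply to drop the references once the data is no longer needed):

```java
import java.util.ArrayList;
import java.util.List;

public class LingeringReferences {
    // Everything in a static collection stays reachable for the lifetime
    // of the class, so the garbage collector can never reclaim it.
    static final List<int[]> cache = new ArrayList<>();

    public static void main(String[] args) {
        for (int i = 0; i < 100; i++) {
            cache.add(new int[1_000]);  // each chunk is pinned by the list
        }
        // Once the data is no longer needed, drop the references so the
        // garbage collector can reclaim the memory:
        cache.clear();
        System.out.println(cache.size());  // prints 0
    }
}
```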

David R Tribble