
I'm measuring the throughput of various parallel data structures. I essentially have varying numbers of clients (1-16) performing read or write operations. When they perform only 10,000 operations, everything goes peachy; when they do more than that (100,000 or 1,000,000), I get the following error: java.lang.OutOfMemoryError: Java heap space

Any ideas how to fix this?

rhombidodecahedron
  • Have you already tried to [increase the heap space](http://stackoverflow.com/questions/6452765/how-to-increase-heap-size-of-jvm)? – Joachim Rohde Nov 01 '11 at 16:22
  • Increase the heap size (-Xmx)? – aishwarya Nov 01 '11 at 16:22
  • this is what happens when there is not enough memory... – bestsss Nov 01 '11 at 16:22
  • This is nowhere near enough info... obviously you are allocating memory per operation that's still referred to somewhere in some data structure, but what exactly is impossible to say from this. You should provide some more detail about your code. – Sean Owen Nov 01 '11 at 16:23
  • Probably rather a leak than a heap that's too small. Have your jvm create a heapdump on OOM and investigate that heapdump with e.g. Eclipse MAT. – fvu Nov 01 '11 at 16:23
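The comments point at a retained-reference leak as the likely cause: memory use that scales with operation count, not a heap that is simply too small. A minimal sketch of the usual pattern (all names here are hypothetical, not the asker's code): recording every operation's result in an unbounded list grows linearly with the number of operations, while aggregating into counters stays constant-space.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.atomic.LongAdder;

public class LeakSketch {
    // Leaky: one boxed Long per operation. At 1,000,000 ops across
    // 16 clients, millions of objects stay reachable until the end.
    static final List<Long> latencies = new ArrayList<>();

    // Fixed: constant-space aggregation, regardless of operation count.
    static final LongAdder totalNanos = new LongAdder();
    static final LongAdder opCount = new LongAdder();

    static void recordLeaky(long nanos) {
        latencies.add(nanos);   // retains a reference for the whole run
    }

    static void recordAggregated(long nanos) {
        totalNanos.add(nanos);  // O(1) memory, thread-safe
        opCount.increment();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 1_000; i++) {
            recordAggregated(42);
        }
        System.out.println(opCount.sum() + " ops, avg "
                + (totalNanos.sum() / opCount.sum()) + " ns");
    }
}
```

If the goal really is a latency histogram (so per-operation data is needed), a fixed-size array of bucket counters gives the same bounded-memory property.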

1 Answer

  1. Does your program have multiple components?
  2. Can you split them and make them asynchronous, using bounded/blocking data structures, or make them event-driven?
  3. What is the effect of setting the -Xmx VM argument?

We'll need more description for a better answer.
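Point 2 above could look like the following (a hedged sketch under assumed structure, not the asker's code): clients submit operations through a bounded `ArrayBlockingQueue`, so producers block when the consumer falls behind instead of piling up unprocessed work in memory.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BoundedPipeline {
    public static void main(String[] args) throws InterruptedException {
        // Capacity 1024: put() blocks when the queue is full, so memory
        // use stays bounded no matter how many operations are issued.
        BlockingQueue<Runnable> queue = new ArrayBlockingQueue<>(1024);

        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    queue.take().run();   // drain and execute operations
                }
            } catch (InterruptedException e) {
                // benchmark finished; exit cleanly
            }
        });
        consumer.start();

        for (int i = 0; i < 10_000; i++) {
            queue.put(() -> { /* perform one read/write operation */ });
        }
        while (!queue.isEmpty()) {
            Thread.sleep(1);              // wait for the drain to finish
        }
        consumer.interrupt();
        consumer.join();
        System.out.println("done");
    }
}
```

The same back-pressure idea applies with an `ExecutorService` built on a bounded work queue and a blocking rejection policy.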

Rohit