
I have the following method:

public static void createGiantArray(int size) {
    int[][][] giantArray = new int[size][size][size];
}

When I call it with a size of 10,000 like so:

createGiantArray(10000);

I get the following error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

How can I create an array that is 10,000 x 10,000 x 10,000 while bypassing the memory exception?

(I am aware my current method is pointless and loses scope. I didn't post the extra code that goes with it.)

– Evorlor
  • You can either change your array to a more dynamic type (one that grows only when needed) or increase the Java heap by changing the JVM launch parameters, e.g. the -Xmx and -Xms values. – Jorge Campos Mar 24 '15 at 17:19
  • Just as a side note: `10,000^3 == 1,000,000,000,000`, i.e. ~ `931 GB` of memory. Really? – dhke Mar 24 '15 at 17:23
  • @dhke Exactly my thoughts, and each integer is 4 bytes, so it's actually 931 × 4 GB! (See the sketch after these comments.) – jbx Mar 24 '15 at 17:25
  • Is there any reason you need such a big array? What are you using it for? Can you refactor it? – user902383 Mar 24 '15 at 17:43
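
As a back-of-the-envelope check of the figures in the comments (a sketch, not part of the original posts), the required memory can be computed directly:

    public class MemoryCheck {
        public static void main(String[] args) {
            long elements = 10_000L * 10_000L * 10_000L; // 10^12 ints
            long bytes = elements * Integer.BYTES;       // 4 bytes per int
            // ~931 "giga-elements" at 4 bytes each comes to ~3725 GiB
            System.out.printf("%.2f GiB%n", bytes / Math.pow(1024, 3));
        }
    }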

3 Answers


Not sure if you realise that a 3-dimensional array of 10,000 × 10,000 × 10,000 integers (4 bytes each) amounts to about 3725 GB! You can increase your heap size with the -Xmx flag (see the example below), but that amount is still humongous.

You need to change the design of your program.
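
For reference, the heap limit is raised on the JVM command line like this (an illustrative sketch; `MyApp` is a placeholder class name, and no realistic -Xmx value will cover ~4 TB of array data):

    java -Xms512m -Xmx8g MyApp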

– jbx
  • The problem was intended to be memory intensive. I am trying to create a problem which needs to be solved. But I did not mean to make it that intensive. Thanks! – Evorlor Mar 24 '15 at 17:25
  • It's even worse, because it's a 3D array of ints, so there also needs to be room for the object references to all the sub-arrays. – dhke Mar 24 '15 at 17:30

Basically you can't maintain a structure of that size in memory. Each int occupies 4 bytes, so you have 10,000 × 10,000 × 10,000 × 4 bytes = 4,000,000,000,000 bytes, which is around 4 terabytes. You need to keep your data in a database (a rather big one) and access it as needed. A secondary possibility is to keep the data in a distributed memory system, such as a distributed hash table spread over many servers, for example 4,000 servers with 1 gigabyte of RAM each.
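
A minimal sketch of the keep-it-out-of-heap idea, using a plain file on disk rather than a full database (the class name and file layout are made up for illustration, and a file for size = 10,000 would still need ~4 TB of disk):

    import java.io.IOException;
    import java.io.RandomAccessFile;

    // Stores the "array" in a file; only the cells being touched ever enter the heap.
    class DiskBackedIntArray3D implements AutoCloseable {
        private final RandomAccessFile file;
        private final long size;

        DiskBackedIntArray3D(String path, long size) throws IOException {
            this.file = new RandomAccessFile(path, "rw");
            this.size = size;
            file.setLength(size * size * size * Integer.BYTES); // pre-size the file
        }

        // Flatten (x, y, z) into a byte offset within the file
        private long offset(long x, long y, long z) {
            return ((x * size + y) * size + z) * Integer.BYTES;
        }

        int get(long x, long y, long z) throws IOException {
            file.seek(offset(x, y, z));
            return file.readInt();
        }

        void set(long x, long y, long z, int value) throws IOException {
            file.seek(offset(x, y, z));
            file.writeInt(value);
        }

        @Override
        public void close() throws IOException {
            file.close();
        }
    }

Per-access seeks are slow; a real implementation would batch reads and writes or memory-map regions, but the point is that heap usage stays constant regardless of the logical array size.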

– Davide Lorenzo MARINO

If the data is going to be relatively sparse, then you can use a different data structure, such as the ones described here or here. (A sketch of the idea follows.)
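
As a minimal sketch of one such sparse structure (this class is made up for illustration, not taken from the linked answers), a map keyed by flattened coordinates only spends memory on non-zero cells:

    import java.util.HashMap;
    import java.util.Map;

    class SparseIntArray3D {
        private final Map<Long, Integer> cells = new HashMap<>();
        private final long size;

        SparseIntArray3D(long size) {
            this.size = size;
        }

        // Flatten (x, y, z) into a single unique key; fits in a long for size = 10,000
        private long key(long x, long y, long z) {
            return (x * size + y) * size + z;
        }

        int get(long x, long y, long z) {
            return cells.getOrDefault(key(x, y, z), 0); // 0 matches int[] default
        }

        void set(long x, long y, long z, int value) {
            if (value == 0) {
                cells.remove(key(x, y, z)); // keep the map sparse
            } else {
                cells.put(key(x, y, z), value);
            }
        }
    }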

– Jeff Evans