I have to create a fairly large double array (12000ish x 55000ish). Unfortunately, I get an out-of-memory exception. I used to develop in Java, where I could change the memory settings. Is this possible with C#, or is it just impossible? I am using VS 2008.
I would suggest you consider an "out of main memory" data structure (i.e. database). Why do you have to store such a large double array? – bitxwise Jan 27 '11 at 11:09
I did chuckle at the choice of word "fairly" too. – Ray Booysen Jan 27 '11 at 11:09
No worries I will persist stuff in a database. – cs0815 Jan 27 '11 at 11:15
Maybe one could also solve the problem by setting the `LARGE_ADDRESS_AWARE` PE flag. – unknown6656 Jan 01 '16 at 11:02
4 Answers
Each `double` is 8 bytes, so you're trying to allocate a single array with just over 5GB (12,000 × 55,000 × 8 bytes ≈ 5.28GB). The CLR has a per-object limit of around 2GB IIRC, even for a 64-bit CLR. In other words, it's not the total amount of memory available that's the problem (although obviously you'll have issues if you don't have enough memory), but the per-object size.
I suggest you split it into smaller arrays, perhaps behind a facade of some description. I don't believe there's any way to work around that limit for a single array.
EDIT: You could go for an array of arrays - aka a jagged array:
double[][] array = new double[12000][];
for (int i = 0; i < array.Length; i++)
{
    // Each row is a separate object, so every allocation stays well under the per-object limit.
    array[i] = new double[55000];
}
Would that be acceptable to you?
(You can't use a rectangular array (`double[,]`) as that would have the same per-object size problem.)

Virtual Memory is not the issue. You're trying to allocate 5GB of memory with a 2GB limit. :) – Ray Booysen Jan 27 '11 at 11:09
Also, due to memory fragmentation, the OP might experience problems as soon as they try to allocate a contiguous block of several hundred MB. – vgru Jan 27 '11 at 11:10
@Jon The array will be quite sparse (document-by-word matrix), so a jagged array might be an option. I have to implement a DB/NHibernate solution soon anyway as things get bigger. Thanks for your input! – cs0815 Jan 27 '11 at 11:18
@csetzkorn If the data is very sparse then perhaps it shouldn't be in an array at all. Use a `Dictionary<int, double>` to store the data. It's probably going to use less space so long as about half of the values are empty (see the sketch after this comment thread). – Servy Apr 04 '13 at 15:10
@Servy: Mapping each (x, y) to a single int? Yes, that's an option. I think I'd probably use a custom value type instead though. – Jon Skeet Apr 04 '13 at 15:12
@JonSkeet If there are sections of contiguous data separated by large sections of contiguous nothingness, then a jagged array would be great (essentially the model paged memory uses). If the large sections of nothingness/stuff aren't contiguous then it doesn't work so well. – Servy Apr 04 '13 at 15:14
@JonSkeet I assumed (incorrectly it seems) his data was single-dimensional. If it's multi-dimensional then you'd want something like a `Dictionary` instead, yes. – Servy Apr 04 '13 at 15:16
@Servy: Given the acceptance, I suspect the OP was actually fine with the fact that the jagged array used quite a lot of memory. These days 5GB isn't *that* much :) – Jon Skeet Apr 04 '13 at 15:17
Is that 2GB limit more because it will always be difficult to get a 2GB contiguous memory block in RAM, or because 2GB of memory space for a single object simply looks unviable on a 32-bit OS? I'm of the latter opinion. A 32-bit OS can address only 4GB of memory. There is already a 3GB barrier when addressing main memory, then you have OS-reserved space, and then the space for running applications. I believe even if the CLR had initially allowed more than 2GB per object it would never have been viable. – RBT Aug 17 '16 at 06:05
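A minimal sketch of the sparse-dictionary idea from the comments above; the SparseMatrix name and the (row, col)-to-long key scheme are illustrative assumptions, not code from the thread:

using System.Collections.Generic;

// Sparse document-by-word matrix: only non-zero cells are stored.
class SparseMatrix
{
    private readonly Dictionary<long, double> cells = new Dictionary<long, double>();
    private readonly int cols;

    public SparseMatrix(int cols) { this.cols = cols; }

    public double this[int row, int col]
    {
        get
        {
            double value;
            // Missing entries read as 0.0, which is where the space saving comes from.
            return cells.TryGetValue((long)row * cols + col, out value) ? value : 0.0;
        }
        set { cells[(long)row * cols + col] = value; }
    }
}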
Since you can't create objects larger than 2GB, you can try to use MemoryMappedFile to work with a chunk of memory of the required size.
using System;
using System.IO.MemoryMappedFiles;

// Capacity and offsets are in bytes, so element counts are multiplied by sizeof(double).
var data = MemoryMappedFile.CreateNew("big data", 12000L * 55000L * sizeof(double));
var view = data.CreateViewAccessor();
var rnd = new Random();
for (var i = 0L; i < 12000L; ++i)
{
    for (var j = 0L; j < 55000L; ++j)
    {
        var input = rnd.NextDouble();
        view.Write<double>((i * 55000L + j) * sizeof(double), ref input);
    }
}
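Reading a value back works the same way; note the offset is again in bytes (this read-back line is an added usage example, not part of the original answer):

// Read back element (3, 42) from the view.
var x = view.ReadDouble((3L * 55000L + 42L) * sizeof(double));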

Provided that your total memory is sufficient, you can prevent out-of-memory exceptions resulting from LOH fragmentation by creating a bunch of smaller arrays and wrapping them in a single IList<T>, or some other indexed interface.
Here is a link which describes it:
BigArray<T>, getting around the 2GB array size limit
Credits: this post (C# chunked array).
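A minimal sketch of the chunked approach; the ChunkedArray name and the fixed chunk size are illustrative assumptions, not code from the linked post:

// Many small arrays behind one indexer; each chunk stays far below the per-object limit.
class ChunkedArray
{
    private const int ChunkSize = 4096;
    private readonly double[][] chunks;

    public ChunkedArray(long length)
    {
        chunks = new double[(length + ChunkSize - 1) / ChunkSize][];
        for (int i = 0; i < chunks.Length; i++)
            chunks[i] = new double[ChunkSize];
    }

    public double this[long index]
    {
        get { return chunks[index / ChunkSize][index % ChunkSize]; }
        set { chunks[index / ChunkSize][index % ChunkSize] = value; }
    }
}

Usage would then be e.g. var big = new ChunkedArray(12000L * 55000L); big[123456789L] = 1.0;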

Well, either you are out of memory (close some programs) or you're hitting the memory allocation limit (about 2GB); this memory needs to be a contiguous block. You could use a 64-bit machine, in which case you'll have more memory available, or I think you can make the application large address aware (googling will tell you how to do this if it's possible in this case).
I believe you add a /3GB switch to the Boot.ini file for the large address awareness.
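For reference, a 32-bit executable is typically marked large-address-aware with the editbin tool from the Visual Studio command prompt (YourApp.exe below is a placeholder for your own binary); the /3GB Boot.ini switch then raises the 32-bit user-mode address limit:

editbin /LARGEADDRESSAWARE YourApp.exe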
