I have to operate on two-dimensional arrays of 34000x34000 items for math calculations.
The obvious problem is that the CLR cannot store such a large block of data in memory. I tried to use MemoryMappedFile
, but it also fails when I try to create a view accessor for the object: MemoryMappedFile.CreateViewAccessor()
. Are there any other existing ways to store large arrays? (I haven't much time to implement a custom large-data storage.)
Thanks.
3 Answers
The <gcAllowVeryLargeObjects>
configuration element allows arrays larger than 2 GB in 64-bit processes. You might give that a try.
Also, see if sparse arrays can help you:
A sparse array is an array in which most of the elements have the default value (usually 0 or null). The occurrence of zero-value elements in a large array is inefficient for both computation and storage.
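If your matrix really is mostly zeros, a dictionary keyed by coordinates is the simplest way to get sparse-array behavior in C#. A minimal sketch, assuming you only need indexed get/set (the `SparseMatrix` type and its members are hypothetical names, not anything from the BCL):

```csharp
using System;
using System.Collections.Generic;

// Minimal sparse 2-D array sketch: only non-default cells are stored,
// so a mostly-zero 34000x34000 matrix costs memory proportional to
// the number of non-zero entries, not to 34000 * 34000.
class SparseMatrix
{
    private readonly Dictionary<(int Row, int Col), double> _cells =
        new Dictionary<(int Row, int Col), double>();

    public double this[int row, int col]
    {
        // Missing cells read as the default value, 0.0.
        get => _cells.TryGetValue((row, col), out var v) ? v : 0.0;
        set
        {
            // Storing 0.0 removes the cell so the dictionary stays sparse.
            if (value == 0.0) _cells.Remove((row, col));
            else _cells[(row, col)] = value;
        }
    }

    public int NonZeroCount => _cells.Count;
}
```

Usage: `var m = new SparseMatrix(); m[33999, 33999] = 1.5;` allocates one dictionary entry, not 9+ GB. This only pays off if the fill ratio is low; a dense matrix in this representation is far more expensive than a plain array.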

- Thanks, this solution suits me best – dotFive Sep 18 '15 at 17:34
If you're using .NET 4.5, you can use the <gcAllowVeryLargeObjects enabled="true|false"/>
config element. It configures the runtime to allow objects larger than 2 GB in memory. See this, section "My app works on large datasets (uses objects > 2GB)", and this.
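For reference, the element lives under `<runtime>` in the application's App.config. A minimal sketch (the setting is ignored in 32-bit processes and requires .NET Framework 4.5 or later):

```xml
<!-- App.config: allow single objects/arrays larger than 2 GB.
     Only has effect in a 64-bit process on .NET 4.5+. -->
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```

Note that even with this enabled, the maximum number of elements per array dimension is still limited, so it lifts the byte-size cap rather than making arrays unbounded.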

As an alternative to gcAllowVeryLargeObjects
, consider using a jagged array instead - dealing with one huge piece of memory (roughly 9-10 GB for doubles) definitely requires some additional effort.
YourType[][] array = Enumerable.Range(0, 34000).Select(_ => new YourType[34000]).ToArray();
Note that you definitely need an x64 process to use such an array - make sure to explicitly build your exe as x64 only.
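The x64 requirement can also be checked at runtime instead of relying on the build configuration alone. A small sketch, assuming a plain loop allocation (the `JaggedAlloc` helper is a hypothetical name for illustration):

```csharp
using System;

static class JaggedAlloc
{
    // Allocates an n x n jagged array row by row. Each row is a
    // separate object well under the 2 GB per-object limit, so no
    // gcAllowVeryLargeObjects is needed; the total for n = 34000
    // doubles is still ~9.2 GB, which only fits a 64-bit process.
    public static double[][] Allocate(int n)
    {
        if (n > 16000 && !Environment.Is64BitProcess)
            throw new InvalidOperationException("Build and run as x64.");

        var grid = new double[n][];
        for (int i = 0; i < n; i++)
            grid[i] = new double[n];
        return grid;
    }
}
```

Usage: `var grid = JaggedAlloc.Allocate(34000); grid[33999][33999] = 1.0;` - note the element is addressed as `[row][col]` with two indexers, not `[row, col]` as with a rectangular array.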

- I've tried; this code throws an OutOfMemoryException for me at around 6000 elements – dotFive Sep 18 '15 at 17:33
- @dotFive I bet you've missed the "need x64 process to use such array" remark. Anyway, good that gcAllowVeryLargeObjects worked for you (though I don't see how one worked and the other did not). – Alexei Levenkov Sep 18 '15 at 18:21