3

On this command

Dictionary<UInt64, int> myIntDict = new Dictionary<UInt64, int>(89478458);

I am getting this error:

System.OutOfMemoryException was unhandled  HResult=-2147024882
Message=Array dimensions exceeded supported range.
Source=mscorlib
StackTrace:
   at System.Collections.Generic.Dictionary`2.Initialize(Int32 capacity)
   at System.Collections.Generic.Dictionary`2..ctor(Int32 capacity, IEqualityComparer`1 comparer)

On 89478457 there is no error. Here is the source of Initialize in Dictionary.cs:

    private void Initialize(int capacity)
    {
        int size = HashHelpers.GetPrime(capacity);
        ...
        entries = new Entry[size];
        ...
    }

When I reproduce this, the error happens on the array creation. Entry is a struct, in this case with size 24 bytes. If we take max Int32 (0x80000000 - 1 = 2147483647) and divide it by 24, we get 89478485, and this number lies between the prime numbers 89478457 and 89478503.

Does this mean that an array of structs cannot be bigger than maxInt32 / sizeOfTheStruct?
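A quick arithmetic sketch confirms the boundary, assuming the 2 GB single-object limit means the `Entry[]` payload must stay below `int.MaxValue` bytes and the observed `sizeof(Entry)` of 24:

```csharp
using System;

// Assumption: the array payload must stay below int.MaxValue
// (2,147,483,647) bytes, and sizeof(Entry) is 24 as observed.
const long objectLimit = int.MaxValue;
const long entrySize = 24;

Console.WriteLine(objectLimit / entrySize);  // 89478485 - max element count

// Dictionary rounds the requested capacity up to a prime:
Console.WriteLine(89478457L * entrySize);    // 2147482968 - still fits
Console.WriteLine(89478503L * entrySize);    // 2147484072 - exceeds the limit
```

So a requested capacity of 89478457 (itself prime) still fits, but 89478458 gets rounded up to the next prime, 89478503, whose backing array would exceed the limit.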

EDIT:

Yes, I actually go over 2 GB. This happens when the dictionary creates the internal array of the struct Entry, where the (key, value) pairs are stored. In my case sizeof(Entry) is 24 bytes, and as a value type it is allocated inline.

And the solution is to use the gcAllowVeryLargeObjects flag (thank you Evk). In .NET Core the flag is the environment variable COMPlus_gcAllowVeryLargeObjects (thank you svick).
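On .NET Core the variable can be set in the shell before starting the process, for example (variable name as given by svick; the effect depends on a 64-bit runtime):

```shell
# Allow objects larger than 2 GB for processes started from this shell
# (64-bit runtime only).
export COMPlus_gcAllowVeryLargeObjects=1
dotnet run
```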

And yes, Paparazzi is right. I have to think about how not to waste memory. Thank you all.

Mottor

2 Answers

10

There is a known limitation of the .NET runtime: the maximum object size allowed on the heap is 2 GB, even on the 64-bit version of the runtime. But starting with .NET 4.5 there is a configuration option that allows you to relax this limit (still only on the 64-bit runtime) and create larger arrays. Note that even with it enabled, the maximum number of elements in a single array dimension remains capped (at 2,146,435,071 for most element types). An example configuration to enable it:

<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
Evk
  • Does this work the same for .NET Core? I think .NET Core does not use app.config. – svick May 30 '16 at 17:42
  • Thanks for the information. It seems your link answers your question. I've tested, and without that option the behavior seems the same (it also throws OutOfMemoryException when allocating a too-big array). – Evk May 30 '16 at 17:53
  • @Evk This has nothing to do with 2GB limit. I am far away from 2GB. – Mottor May 30 '16 at 18:02
  • @Mottor No, you are not far away from 2 GB. It needs contiguous memory. – paparazzo May 30 '16 at 18:04
  • @Mottor Just apply the configuration change mentioned and see how the exception goes away. This limit is not related to the amount of memory you have; it applies even if you have terabytes. – Evk May 30 '16 at 18:10
0

On the surface the Dictionary does not make sense.
You can only have int unique values.
Do you really have that many duplicates?

UInt32 goes up to 4,294,967,295.
Why are you wasting 4 bytes?

You have 89,478,458 rows, and currently a row is 12 bytes.
You reach 1 GB at about 83,333,333 rows.
Since an object needs contiguous memory, 1 GB is more of a practical limit.

If the row is really a 24-byte struct,
then 1 GB is about 41,666,666 rows.

That is just a really big collection

You can split it up into more than one collection.

Or use a class, as then the dictionary only stores a reference (4 bytes in a 32-bit process, 8 bytes in 64-bit).

paparazzo
  • The UInt64 encodes a position (X, Y, Z) in a 3D array; the value is the value at that position. They are filled dynamically from the user (SQL) and the array is sparse. Then there are a lot of calculations. – Mottor May 30 '16 at 18:10
  • Then I suggest a more efficient encoding, as you are wasting a lot of space. – paparazzo May 30 '16 at 18:12
  • It is encoded with a bit mask. In reality the normal size is between 100,000 and 500,000, but I need to know where the limit is, because I do not know anything about the data at compile time. – Mottor May 30 '16 at 18:15
  • So you don't know the size. It won't be an exact number, as contiguous memory is affected by other factors. You are wasting a lot of space, and all you seem to care about is throwing more memory at it. – paparazzo May 30 '16 at 18:20
  • And can you tell me your proposal? How do you encode 3 ints in one key below the Int64 size so they are fast to encode? And if you look at the value, it is an int, not a 24-byte struct. The 24-byte struct is the internal implementation of the C# dictionary (click on the link), the Entry struct. – Mottor May 30 '16 at 18:26
  • Post a separate question. Do you really need three ints for a position? If it is sparse, then again you are wasting memory. That does not compute with 89,478,458 values, not even close. Again, it looks to me like you are wasting a lot of space, and all you seem to care about is throwing more memory at it. – paparazzo May 30 '16 at 18:32
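The bit-mask key encoding discussed in these comments could be sketched as follows (the 21-bits-per-axis split is an assumption for illustration, not the asker's actual layout):

```csharp
using System;

// Hypothetical packing of three coordinates into one UInt64 key.
// 21 bits per axis covers 0 .. 2,097,151 and leaves one bit spare.
static ulong Pack(uint x, uint y, uint z) =>
    ((ulong)(x & 0x1FFFFF))
    | ((ulong)(y & 0x1FFFFF) << 21)
    | ((ulong)(z & 0x1FFFFF) << 42);

static (uint X, uint Y, uint Z) Unpack(ulong key) =>
    ((uint)(key & 0x1FFFFF),
     (uint)((key >> 21) & 0x1FFFFF),
     (uint)((key >> 42) & 0x1FFFFF));

ulong key = Pack(100, 200, 300);
Console.WriteLine(Unpack(key)); // (100, 200, 300)
```

Packing and unpacking are a handful of shifts and masks, so lookups stay cheap; whether 21 bits per axis is enough depends on the actual coordinate ranges.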