
In the past, .NET Framework always had a maximum of UInt32.MaxValue elements per array, even if you set gcAllowVeryLargeObjects in web.config.

This was always terrible because, even on a 64-bit machine, you couldn't allocate a really large array.

So the question is: did Microsoft finally fix this in the latest .NET Core / .NET 5?

Any help would be greatly appreciated; the documentation on this issue is not good. It's been 20 years, so hopefully they have finally fixed this.
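
For reference, here is roughly the check I have in mind (a minimal sketch, assuming a 64-bit .NET 5 console app; the class and variable names are just for illustration):

```csharp
// Minimal sketch, assuming a 64-bit .NET 5 console app.
// The per-dimension element-count cap (~2^31) is still enforced, even though
// the *total* size of an array can exceed 2 GB when elements are wider than a byte.
using System;

class ArrayLimitCheck
{
    static void Main()
    {
        Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");

        try
        {
            // More elements than the documented per-dimension cap: typically fails
            // with OutOfMemoryException ("Array dimensions exceeded supported range"),
            // regardless of how much RAM the machine has.
            var tooManyElements = new byte[3_000_000_000L];
            Console.WriteLine(tooManyElements.LongLength);
        }
        catch (Exception ex)
        {
            Console.WriteLine($"{ex.GetType().Name}: {ex.Message}");
        }

        // By contrast, ~16 GB of longs is "only" 2 billion elements, so it is
        // allowed on 64-bit (if the machine actually has the memory for it):
        // var big = new long[2_000_000_000];
    }
}
```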

A X
  • The maximum number of array elements in C# is effectively 2^31, which is int.MaxValue, not uint.MaxValue. That hasn't changed - the type that you use for an array index is `int`, so this is not likely to ever change. However, each element in the array can be bigger than one byte, so the total size of the array can far exceed 2GB. (Note that an Array *can* have up to 4 billion elements in total, but that's not so easily accessed from C# using an integer index.) – Matthew Watson Sep 27 '20 at 20:34
  • If you're trying to exceed a maximum [you're probably doing something wrong](https://devblogs.microsoft.com/oldnewthing/20070301-00/?p=27803). Computers today can store petabytes of data. [Tell us why](https://meta.stackexchange.com/questions/66377/) you think you need an array to do that. – Dour High Arch Sep 27 '20 at 20:39
  • @DourHighArch This kind of attitude is why Microsoft is famous for building products that don't scale. "Oh, if you want to use lots of RAM in an array you must be doing something wrong." Other platforms and programming languages do not have this kind of attitude, and thus have more adoption. It's 2020 - there are a million reasons why you would want to store lots of data in a large array. Do Google and Amazon think like this? No, they don't. No explanation is needed for this because the reasons are obvious to anyone. – A X Sep 27 '20 at 20:56
  • @DourHighArch The limit should be 2^64 not 2^32 in a 64-bit world, and other languages do not have this problem. – A X Sep 27 '20 at 20:58
  • At this point you should think about using a database; SQLite will do – Nekuskus Sep 27 '20 at 21:09
  • @Nekuś Thanks for the suggestion, but in this case we are trying to use .NET to build a very large caching cluster, and due to these kinds of limitations we may have to switch to Java, GoLang or some other language that doesn't have these kinds of limits (which is really unfortunate) – A X Sep 27 '20 at 23:33
  • @Abr no one forces you to pick C#. It is quite strange that you've mentioned Java, which generally has the same limitation (https://stackoverflow.com/questions/3038392/do-java-arrays-have-a-maximum-size). You should be good with C++ though (https://stackoverflow.com/questions/216259/is-there-a-max-array-length-limit-in-c). – Alexei Levenkov Sep 28 '20 at 06:47

1 Answer


> The array size is limited to a total of 4 billion elements, and to a maximum index of 0X7FEFFFFF in any given dimension (0X7FFFFFC7 for byte arrays and arrays of single-byte structures).

Taken from the Remarks section of the [System.Array documentation](https://learn.microsoft.com/en-us/dotnet/api/system.array).
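
To put those hex values in decimal, here is a small illustrative sketch (it only prints the documented caps; the class and constant names are made up for the example):

```csharp
// Illustrative only: the per-dimension caps quoted from the Remarks section.
using System;

class QuotedLimits
{
    static void Main()
    {
        const int maxIndexMostTypes = 0x7FEFFFFF; // 2,146,435,071
        const int maxIndexByteSized = 0x7FFFFFC7; // 2,147,483,591

        Console.WriteLine($"Max index, most element types:       {maxIndexMostTypes:N0}");
        Console.WriteLine($"Max index, byte-sized element types: {maxIndexByteSized:N0}");

        // The "4 billion elements" figure is the total across all dimensions
        // (UInt32.MaxValue), not the limit for a single dimension.
        Console.WriteLine($"Max total elements (all dimensions): {uint.MaxValue:N0}");
    }
}
```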

Adrian
  • Thank you! This is too few items from my point of view. Microsoft should be doing away with limits like this. The max number of array elements should obviously be 2^64, not 2^32, in a 64-bit world. – A X Sep 27 '20 at 20:57
  • Adding to this, if you look at the MSDN page for gcAllowVeryLargeObjects (.NET 4.5+), the 'new' limit is clearly defined as "UInt32.MaxValue" elements, with the maximum index in a single dimension being 2,147,483,591 for byte arrays when the flag is set, allowing array sizes over 2 GB. (https://learn.microsoft.com/en-us/dotnet/framework/configure-apps/file-schema/runtime/gcallowverylargeobjects-element) – Reahreic Mar 22 '22 at 11:52