48

In C# 2008, what is the Maximum Size that an Array can hold?

RRUZ
  • 134,889
  • 20
  • 356
  • 483
DNR
  • 3,706
  • 14
  • 56
  • 91

7 Answers

55
System.Int32.MaxValue

Assuming you mean System.Array, i.e. any normally defined array (int[], etc.), this is the maximum number of elements the array can hold. The size of each element is limited only by the amount of memory or virtual memory available to hold them.

This limit is enforced because System.Array uses an Int32 as its indexer, hence only valid values for an Int32 can be used. On top of this, only non-negative values (i.e. >= 0) may be used. This means the absolute upper bound on the size of an array is the absolute upper bound on values for an Int32, which is available as Int32.MaxValue and is equal to 2^31 - 1, or roughly 2 billion.

On a completely different note, if you're worrying about this, it's likely you're using a lot of data, either correctly or incorrectly. In this case, I'd look into using a List<T> instead of an array, so that you only use as much memory as needed. In fact, I'd recommend using List<T> or another of the generic collection types all the time. This means that only as much memory as you are actually using will be allocated, but you can use it like you would a normal array.
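As a minimal sketch (the class name and values here are just for illustration), a List<T> can be indexed like an array while growing its backing storage only as elements are added:

```csharp
using System;
using System.Collections.Generic;

class ListDemo
{
    static void Main()
    {
        // The list grows its internal array on demand, so you only
        // pay for roughly as many elements as you actually add.
        var numbers = new List<int>();
        for (int i = 0; i < 5; i++)
            numbers.Add(i * i);

        Console.WriteLine(numbers.Count); // 5
        Console.WriteLine(numbers[3]);    // 9, indexed just like an array
    }
}
```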

The other collection of note is Dictionary<int, T>, which you can use like a normal array too, but which is only populated sparsely. For instance, in the following code, only one element will be created, instead of the 1,001 that an equivalent array would need:

Dictionary<int, string> foo = new Dictionary<int, string>();
foo[1000] = "Hello world!";
Console.WriteLine(foo[1000]);

Using Dictionary also lets you control the type of the indexer, and allows you to use negative values. For the absolute maximum-sized sparse array you could use a Dictionary<ulong, T>, which will provide more potential elements than you could possibly think about.
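A minimal sketch of that ulong-keyed sparse "array" (the key and string here are arbitrary examples): only the entries you actually assign consume memory, no matter how large the key is.

```csharp
using System;
using System.Collections.Generic;

class SparseDemo
{
    static void Main()
    {
        // Keys can be anywhere in the ulong range (0 .. 2^64 - 1);
        // only the single assigned entry is ever allocated.
        var sparse = new Dictionary<ulong, string>();
        sparse[ulong.MaxValue] = "last possible index";
        Console.WriteLine(sparse[ulong.MaxValue]);
        Console.WriteLine(sparse.Count); // 1
    }
}
```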

Matthew Scharley
  • 127,823
  • 52
  • 194
  • 222
  • This is true, and I'll add some stuff about Lists now, but it's an interesting question, especially when you get into making your own indexable classes; you can use e.g. `long` to provide more potential values, if that's an issue for you. – Matthew Scharley Sep 08 '09 at 02:46
  • 14
    I'm pretty sure that .NET also imposes a maximum object size of 2GB. This means that even if arrays used an `Int64` indexer, the max number of elements in a `byte[]` array would still be restricted to about `2^31`; The max number of elements in an `int[]` array would be about `(2^31)/4`; The max number of elements in a `long[]` array would be about `(2^31)/8` etc etc. – LukeH Sep 08 '09 at 03:18
  • Worthy of noting, if it does. I don't know one way or the other, but it strikes me as an implementation concern, one that might be lifted or changed as the amount of memory in computers increases. The limits I describe won't change, because they'd cause major changes in the way that the framework is composed. (ie, code that checks the max size of an array by using `Int32.MaxValue` will instantly be incorrect) – Matthew Scharley Sep 08 '09 at 03:23
  • 6
    The 2 GB limit is real. Please see this related question: http://stackoverflow.com/questions/1087982/single-objects-still-limited-to-2-gb-in-size-in-clr-4-0 – Brian Rasmussen Sep 08 '09 at 03:34
  • @Rob: Someone may be solving a problem of programming competition which may include a test of 150000 elements and during which it may occur to him/her that what could be the maximum value for an Array length. – Muhammad Mobeen Qureshi Oct 08 '13 at 07:53
  • 1
    Thanks for the idea of using Dictionary to handle sparsely indexed data. I have been thinking about optimizing our current array usage and this is a good approach. – Mr.Hunt Dec 26 '13 at 10:45
  • @Mr.Hunt: Just be a little careful with that construct; it will cause all your keys to get boxed. Probably not a huge issue, but still something to be aware of. – Matthew Scharley Jan 02 '14 at 03:42
  • 5
    This answer is plain wrong... First of all the max array size is less than `Int32.MaxValue` - as others have pointed out it is a magical number. This magical number constant depends on the .Net version used and whether the type stored is one byte in size or not. Secondly Dictionaries uses arrays internally meaning the Dictionary's internal structure has nothing to do with the type of its values or keys so creating a `Dictionary` does not magically break this limit. – AnorZaken Feb 01 '16 at 17:08
  • @MatthewScharley There's no boxing happening there. – Mike Marynowski Oct 16 '19 at 09:52
  • I am having this issue https://stackoverflow.com/questions/67202298/why-is-max-size-of-int-array-lower-than-int32-maxvalue anyone have any clues? – Saamer Apr 21 '21 at 19:31
  • Have to agree with @AnorZaken, this answer is demonstrably wrong. Here is my proof: `var arr = new int[int.MaxValue];` will throw a `System.OutOfMemoryException`. This goes for all collections that use an array internally too. – dyslexicanaboko Jun 20 '21 at 03:32
23

Per MSDN:

By default, the maximum size of an Array is 2 gigabytes (GB).

In a 64-bit environment, you can avoid the size restriction by setting the enabled attribute of the gcAllowVeryLargeObjects configuration element to true in the run-time environment.

However, the array will still be limited to a total of 4 billion elements.

Reference: http://msdn.microsoft.com/en-us/library/System.Array(v=vs.110).aspx

Note: here I am focusing on the actual length of the array, assuming that we have enough hardware RAM.
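For reference, the configuration element that MSDN page describes goes in the application's App.config like this (it only takes effect on the 64-bit runtime, .NET 4.5 or later):

```xml
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```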

Sai
  • 1,376
  • 2
  • 15
  • 25
14

This answer is about .NET 4.5

According to MSDN, the maximum number of elements in an array of bytes is 2147483591. For .NET prior to 4.5 this was also the overall limit for any array. In .NET 4.5 this maximum stays the same for single-byte element types, but for other element types the maximum element count is 2146435071.

This is the code for illustration:

    using System;
    using System.Runtime.InteropServices; // needed for Marshal.SizeOf

    static void Main(string[] args)
    {
        // -----------------------------------------------
        // Pre .NET 4.5, or gcAllowVeryLargeObjects unset
        const int twoGig = 2147483591; // magic number from .NET

        var type = typeof(int);          // element type to use
        var size = Marshal.SizeOf(type); // element size in bytes
        var num = twoGig / size;         // max element count

        var arr20 = Array.CreateInstance(type, num);
        var arr21 = new byte[num];

        // -----------------------------------------------
        // .NET 4.5 with x64 and gcAllowVeryLargeObjects set
        var arr451 = new byte[2147483591];
        var arr452 = Array.CreateInstance(typeof(int), 2146435071);
        var arr453 = new byte[2146435071]; // another magic number
    }
Chris Marisic
  • 32,487
  • 24
  • 164
  • 258
Anton K
  • 4,658
  • 2
  • 47
  • 60
4

Here is an answer to your question that goes into detail: http://www.velocityreviews.com/forums/t372598-maximum-size-of-byte-array.html

You may want to mention which version of .NET you are using and how much memory you have.

You will be stuck with a 2 GB per-object limit for your application though, so it depends on what is in your array.

James Black
  • 41,583
  • 10
  • 86
  • 166
0

I think it is linked to your RAM (or, more precisely, virtual memory) space, and the absolute maximum is constrained by your OS version (e.g. 32-bit or 64-bit).

waqasahmed
  • 3,555
  • 6
  • 32
  • 52
0

I think if you don't consider the VM, it is Int32.MaxValue.

Peter Lee
  • 1,011
  • 1
  • 10
  • 11
  • 1
    If you're ignoring the VM, in principle it's Int64.MaxValue. Although everyone's saying that indexing is 32-bit, that's only true for code using one of the IL ldelem instructions (and to be ultra-pedantic, it's only true for VMs in which an IL native int is 32 bits.) But you don't have to use that instruction. Arrays offer an overload of GetValue that accepts a 64-bit argument, so in principle, this supports 64-bit indexing. The CLR does not support this, so in practice, the 2GB limit remains, but if you're ignoring the VM, then System.Array theoretically supports bigger. – Ian Griffiths Jan 18 '12 at 10:52
  • 1
    @IanGriffiths: aren't the .NET int and uint types are defined to be 32 bit always, independent of hardware architectures? Would it be allowed to change the size of an "IL native int"? – Thomas Weller Dec 06 '21 at 07:09
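A minimal sketch of the 64-bit `GetValue`/`SetValue` overloads mentioned in the comments above (shown on a small array, since the CLR still enforces its object size limit in practice):

```csharp
using System;

class LongIndexDemo
{
    static void Main()
    {
        var arr = new int[10];

        // System.Array exposes overloads that accept a 64-bit index,
        // even though the CLR restricts actual array lengths.
        arr.SetValue(42, 3L);                // long index overload
        Console.WriteLine(arr.GetValue(3L)); // 42
    }
}
```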
0

By default the .NET Framework runtime (CLR) limits the maximum object size allowed on the GC heap to 2 GB, even in the 64-bit version of the runtime. Since arrays are just a special kind of managed type created within the managed heap, they also suffer from this limitation. Since .NET 4.5 you can change the application configuration file to enable arrays that are greater than 2 GB.

Since .NET Core 2.0 arrays greater than 2GB are enabled by default.

Note that arrays also have a MaxLength limit, which equals 2147483591 elements. This means you will be able to allocate 8 GB of double, but not 8 GB of byte.

var arr = new double[1000000000];  // OK: 10^9 doubles, 8 GB
var bytes = new byte[8000000000];  // fails at run time: exceeds the element limit
Kuba Szostak
  • 356
  • 3
  • 5