
I'm using the Array.Copy(Array, Array, Int64) method. I have the following static method:

public static T[,] Copy<T>(T[,] a)
    where T : new()
{
    long n1 = a.GetLongLength(0);
    long n2 = a.GetLongLength(1);
    T[,] b = new T[n1, n2];
    System.Array.Copy(a, b, n1 * n2);
    return b;
}

and the following code to test it:

double[,] m1 = new double[46340, 46340];
double[,] m2 = Copy(m1); // works

double[,] m3 = new double[46341, 46341];
double[,] m4 = Copy(m3); // throws: "Arrays larger than 2GB are not supported"
                         // (for the length argument of Array.Copy)

I'm aware of <gcAllowVeryLargeObjects enabled="true" /> and I've set it to true.
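
For reference, a minimal App.config carrying that switch might look like this (the gcAllowVeryLargeObjects element is the documented one; the surrounding skeleton is just the standard configuration file layout):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <!-- Allow arrays whose total size exceeds 2 GB on 64-bit platforms.
         Note: the maximum number of elements per array is unchanged. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>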

I saw in the documentation for Array.Copy(Array, Array, Int64) that there is the following remark about the length argument:

A 64-bit integer that represents the number of elements to copy. The integer must be between zero and Int32.MaxValue, inclusive.

I don't understand why this limit exists for the length argument when its type is Int64. Is there a workaround? Is it planned to remove this limit in an upcoming version of .NET?

Wollmich

1 Answer


A workaround could be the following code:

public static T[,] Copy<T>(T[,] a)
    where T : new()
{
    long n1 = a.GetLongLength(0);
    long n2 = a.GetLongLength(1);
    long offset = 0;
    long length = n1 * n2;            // total number of elements to copy
    long maxlength = Int32.MaxValue;  // Array.Copy accepts at most this many elements per call
    T[,] b = new T[n1, n2];
    // Copy full blocks of Int32.MaxValue elements each...
    while (length > maxlength)
    {
        System.Array.Copy(a, offset, b, offset, maxlength);
        offset += maxlength;
        length -= maxlength;
    }
    // ...then copy the remaining partial block.
    System.Array.Copy(a, offset, b, offset, length);
    return b;
}

Here the array is copied in blocks of at most Int32.MaxValue elements per Array.Copy call.
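
A quick test mirroring the failing case from the question (this assumes <gcAllowVeryLargeObjects enabled="true" /> is set and the process is 64-bit with enough memory for the two roughly 16 GB arrays):

double[,] m3 = new double[46341, 46341]; // 46341 * 46341 = 2147488281 > Int32.MaxValue
double[,] m4 = Copy(m3);                 // no longer throws; copied in two blocks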

Wollmich
  • @AdamHouldsworth I've tested with double (size is 8) and it looks like it works even when one array block is about 16 GB. – Wollmich Dec 15 '17 at 13:42
  • @AdamHouldsworth but OP is using `gcAllowVeryLargeObjects`, so there is no 2 GB limit. – Evk Dec 15 '17 at 13:47
  • @Wollmich Oh sorry, I see: there is no issue with the array size; there just appears to be a contrived limitation in the current `Copy` implementation. – Adam Houldsworth Dec 15 '17 at 13:47
  • @AdamHouldsworth I tried it as well now with `Guid[,] m1 = new Guid[46341, 46341]` and `Guid[,] m2 = Copy(m1)`. This seems to work as well. So the exception message `Arrays larger than 2GB are not supported` isn't right either. – Wollmich Dec 15 '17 at 13:54
  • Arrays have a limitation on the number of elements. [The source seems to protect this](https://referencesource.microsoft.com/#mscorlib/system/array.cs,2b9709b0037280d3). I can't allocate `byte[] b = new byte[int.MaxValue - 1]` even with large objects enabled. Are multi-dimensional arrays considered one array in the runtime or many arrays? That might be masking the per-object storage limitation, but the array element count limitation still exists I think. Looks like `Copy` doesn't cope with multi-dimensional arrays? – Adam Houldsworth Dec 15 '17 at 13:55
  • @AdamHouldsworth Multidimensional arrays can have up to 4294967295 elements, see https://stackoverflow.com/questions/47830510/2d-array-with-more-than-655352-elements-array-dimensions-exceeded-supported. – Wollmich Dec 15 '17 at 15:31
  • @Wollmich According to the docs that is the same as a regular array then, which stipulates a max element count of `UInt32.MaxValue`. Either way, the copy method seems to constrain it even further; it might be worth posting a question / bug to Microsoft's GitHub pages. – Adam Houldsworth Dec 15 '17 at 15:52
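
A short sketch of the limits discussed in the comments above (behavior assumed for .NET Framework 4.x, 64-bit, with gcAllowVeryLargeObjects enabled and enough physical memory for the ~16 GB test array; the exact maximum lengths are runtime implementation details):

using System;

class LimitProbe
{
    static void Main()
    {
        // Single-dimensional byte arrays are capped slightly below
        // Int32.MaxValue elements even with gcAllowVeryLargeObjects,
        // matching the failed allocation noted in the comments.
        try
        {
            byte[] b = new byte[int.MaxValue - 1];
        }
        catch (OutOfMemoryException e)
        {
            Console.WriteLine(e.Message);
        }

        // A multi-dimensional array can exceed Int32.MaxValue elements
        // in total, as in the question's double[46341, 46341] example.
        double[,] big = new double[46341, 46341];
        Console.WriteLine(big.LongLength); // 2147488281 > Int32.MaxValue
    }
}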