I was profiling a C++/CLI project at work that had a strange memory signature: about 7 GB of Committed bytes, but only around 30 MB in the Working Set. What I discovered was that the problem lay in something like this piece of code:
ref class c1
{
public:
    int x;

    static void createArray()
    {
        auto val = gcnew c1;
        auto arr = gcnew array<c1^>(1) { val };
        auto jagged = gcnew array<array<c1^>^> { arr };
        // Make sure the array doesn't get optimized away
        Console::WriteLine(jagged[0][0]->x);
    }
};
I was expecting the initialization of jagged to translate to something like this in C#:
var jagged = new c1[][] { arr };
Instead, using ILSpy, I discovered that it actually translates to something like this (decompiling the optimized binaries):
public static void createArray()
{
    c1 val = new c1();
    c1[] array = new c1[]
    {
        val
    };
    c1[][] array2 = new c1[][]
    {
        new c1[12648430] // the constant is equal to 0xC0FFEE
    };
    array2[0] = array;
    Console.WriteLine(array2[0][0].x);
}
I find it very hard to understand why it would allocate a 12,648,430-element temporary array, only to overwrite that element with the real inner array on the next line. At the same time, I find it hard to imagine that such a bug actually shipped in the MSVC compiler that comes with Visual Studio 2013 Professional, but I can't think of any other explanation.
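For what it's worth, the only way I can think of to sidestep this is to avoid the aggregate-initializer form on the jagged array and assign the inner array by hand, roughly like the sketch below (createArrayWorkaround is just an illustrative name, and I'm assuming the initializer-list syntax is what triggers the temporary; I haven't decompiled this variant to confirm):

static void createArrayWorkaround()
{
    auto val = gcnew c1;
    auto arr = gcnew array<c1^>(1) { val };

    // Size the outer array explicitly and assign the element,
    // instead of using the aggregate-initializer form.
    auto jagged = gcnew array<array<c1^>^>(1);
    jagged[0] = arr;

    // Make sure the array doesn't get optimized away
    Console::WriteLine(jagged[0][0]->x);
}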
Am I missing something?