Ok so I'm working on an application in Unity that is highly memory-dependent. Because of this, I'm storing the data as two-byte words inside byte arrays. I also need every 4000 words or so to be instantly accessible, so they can be moved around and modified without copying data. So instead of using a single massive byte array, I'm keeping an array of simple objects, each containing only one 8000-byte array (the size of these subunits of data is significant and can't really be changed).
So here's the problem: I expect each object to be less than 9000 bytes (giving ample space for the administrative overhead of the .NET framework). Yet when I actually test the size of these simple objects, they consume approximately 12000 bytes each?! I understand the need for memory-management overhead, but this roughly 30% increase just isn't acceptable. The memory efficiency of whatever data structure I use is very important, as the application will need to have as much as 20 GB of data loaded into memory at once. A 30% overhead means the application won't be able to run on my computer.
So where is the memory being used, and is there a way for me to avoid this large amount of overhead? I'm afraid I don't understand the .NET backend well enough to have any idea what's going on. I've already looked over the following thread, but I don't see anything there that would explain the amount of overhead I'm experiencing: C# Object Size Overhead
While I suppose it's possible that Unity is responsible for the overhead, I'm not sure, as I'm not using any Unity libraries or modules for this specific functionality besides NUnit for testing.
Here's an example class that I'm using to store the data:
public class dataObject {
    public byte[] data;

    public dataObject() {
        this.data = new byte[8000];
    }
}
And this is the code I'm using to test the size of the class in memory:
long mem1;
long mem2;
int N = 10000; // number of objects to create; a larger number gives a more accurate approximation of the per-object size

mem1 = GC.GetTotalMemory(true);
dataObject[] dataObjectArray = new dataObject[N];
for (int i = 0; i < N; i++) {
    dataObjectArray[i] = new dataObject();
}
mem2 = GC.GetTotalMemory(true);
GC.KeepAlive(dataObjectArray); // keeps the array reachable so it can't be collected before the second measurement
long dataObjectSize = (mem2 - mem1) / N; // average memory consumed per object
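To isolate whether the overhead comes from the wrapper object or from the `byte[]` allocation itself, a variation of the same measurement could allocate the bare arrays directly, with no wrapper class at all (this is a diagnostic sketch I put together, not something from the original test code):

```csharp
using System;

class ArrayOverheadTest {
    static void Main() {
        const int N = 10000;

        // Measure bare byte[8000] allocations, with no wrapper object,
        // so any remaining overhead must belong to the array itself.
        long before = GC.GetTotalMemory(true);
        byte[][] bareArrays = new byte[N][];
        for (int i = 0; i < N; i++) {
            bareArrays[i] = new byte[8000];
        }
        long after = GC.GetTotalMemory(true);
        GC.KeepAlive(bareArrays); // keep arrays reachable through the second measurement

        Console.WriteLine($"Average per array: {(after - before) / N} bytes");
    }
}
```

If the bare arrays still come out near 12000 bytes each, the overhead is in the runtime's array allocation (allocator rounding, GC bookkeeping), not in the wrapper class.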
[Edit]:
I tried using a struct rather than a class for storing the data, with no significant change. Here's my struct implementation (which, as 3Dave points out, makes more sense to use than a class instance):
public struct dataStruct {
    public byte[] data;

    // Note: parameterless struct constructors require C# 10 or later;
    // on older compilers this would need a factory method instead.
    public dataStruct() {
        this.data = new byte[8000];
    }
}