Before asking, a small disclaimer: yes, I am aware of the differences between Virtual Memory, Physical Memory, and the Working Set. All the numbers below refer to virtual memory.
The situation is as follows: we have a 32-bit C# app that imports x86 C++ libraries (with plenty of native dependencies, so migrating to x64 is not an option at the moment). The app loads a large dataset through the unmanaged component and then tries to display it (a report).
However, when the dataset is particularly large, an OutOfMemoryException is thrown upon adding an item to a list, as in the code shown further below.
No surprise there. What is surprising, however, is that at that moment the application still has around 280 MB of free VM.
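For reference, the amount of free VM can be checked from inside the process via the Win32 GlobalMemoryStatusEx API. The P/Invoke sketch below is purely illustrative (the VmStatus wrapper name is mine, not part of the app):

using System;
using System.Runtime.InteropServices;

// Illustrative sketch only: queries this process's virtual address space
// via the Win32 GlobalMemoryStatusEx API.
static class VmStatus
{
    [StructLayout(LayoutKind.Sequential)]
    struct MEMORYSTATUSEX
    {
        public uint dwLength;
        public uint dwMemoryLoad;
        public ulong ullTotalPhys;
        public ulong ullAvailPhys;
        public ulong ullTotalPageFile;
        public ulong ullAvailPageFile;
        public ulong ullTotalVirtual;
        public ulong ullAvailVirtual;          // free VM in this process's address space
        public ulong ullAvailExtendedVirtual;  // reserved, always 0
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GlobalMemoryStatusEx(ref MEMORYSTATUSEX lpBuffer);

    public static void Print()
    {
        var status = new MEMORYSTATUSEX();
        status.dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
        if (GlobalMemoryStatusEx(ref status))
            Console.WriteLine("Free VM: {0} MB of {1} MB",
                status.ullAvailVirtual / (1024 * 1024),
                status.ullTotalVirtual / (1024 * 1024));
    }
}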
When debugging pure unmanaged (C++) apps, this was never the case: a bad_alloc occurred only when there was no free VM left, or when no free chunk of address space was large enough.
Hence the question: what can be the reason for this? I understand how to fight the issue (the unmanaged components really do eat a lot of memory and fragment the address space badly), but why does the OutOfMemoryException appear so early?
The code in question looks like this:
List<Cell> r = new List<Cell>(cols); // capacity = the number of columns
for (int j = 0; j < cols; j++)
{
    r.Add(new CustomCell()); // CustomCell derives from Cell; the exception is thrown on this line
}
At the moment of the exception, the list (in one occurrence) had 85 items, and its capacity was 200-something (the number of columns, as passed to the constructor). So the exception most likely occurred while allocating a CustomCell. A CustomCell object has many fields, but is certainly less than 1 KB in total. The 280 MB of free memory sit in chunks ranging from 64 KB to 14 MB, so there should be plenty of room for such a small allocation.
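The chunk sizes above come from walking the address space region by region. An illustrative sketch of such a walk using the Win32 VirtualQuery API (the FreeRegionScan name is mine; this is not our actual diagnostic code):

using System;
using System.Runtime.InteropServices;

// Illustrative sketch only: walks the address space with Win32 VirtualQuery,
// summing MEM_FREE regions and tracking the largest contiguous chunk.
static class FreeRegionScan
{
    const uint MEM_FREE = 0x10000;

    [StructLayout(LayoutKind.Sequential)]
    struct MEMORY_BASIC_INFORMATION
    {
        public IntPtr BaseAddress;
        public IntPtr AllocationBase;
        public uint AllocationProtect;
        public IntPtr RegionSize;
        public uint State;
        public uint Protect;
        public uint Type;
    }

    [DllImport("kernel32.dll")]
    static extern UIntPtr VirtualQuery(IntPtr lpAddress,
        out MEMORY_BASIC_INFORMATION lpBuffer, UIntPtr dwLength);

    public static void Print()
    {
        long address = 0, totalFree = 0, largestFree = 0;
        uint mbiSize = (uint)Marshal.SizeOf(typeof(MEMORY_BASIC_INFORMATION));

        // 2 GB user address space of a 32-bit process (no /LARGEADDRESSAWARE).
        while (address < 0x80000000L)
        {
            MEMORY_BASIC_INFORMATION mbi;
            if (VirtualQuery((IntPtr)address, out mbi, (UIntPtr)mbiSize) == UIntPtr.Zero)
                break;

            long size = mbi.RegionSize.ToInt64();
            if (mbi.State == MEM_FREE)
            {
                totalFree += size;
                if (size > largestFree) largestFree = size;
            }
            address = mbi.BaseAddress.ToInt64() + size; // jump to the next region
        }

        Console.WriteLine("Total free: {0} MB, largest chunk: {1} MB",
            totalFree / (1024 * 1024), largestFree / (1024 * 1024));
    }
}

This is essentially the per-region view that tools like VMMap present.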