I'm working on a program that processes pictures of circuits at close to a 1:1 micrometer-to-pixel ratio, so I've got quite a few dynamically allocated elements (CImg objects, allocated via their constructors) sitting in various vectors. Outside of that, the only other allocations are a few Qt widgets.
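To give a rough idea of the layout, it boils down to something like this (a simplified sketch, not my actual code; the names are made up):

#include <vector>
#include "CImg.h"
using namespace cimg_library;

// Simplified picture of how the images are held: each CImg's pixel buffer
// is heap-allocated by its constructor, and the vectors grow as the
// circuit pictures are processed.
std::vector<CImg<unsigned char>> scanTiles;
std::vector<CImg<unsigned char>> processedLayers;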
CImg<unsigned char> image(this->image.width(), this->image.height(), 1, 3, 0);
That call is what sets it off. Here is the CImg constructor it lands in, with the spot where the exception is thrown marked:
CImg(const unsigned int size_x, const unsigned int size_y,
     const unsigned int size_z, const unsigned int size_c, const T& value) :
  _is_shared(false) {
  const unsigned long siz = (unsigned long)size_x*size_y*size_z*size_c;
  if (siz) {
    _width = size_x; _height = size_y; _depth = size_z; _spectrum = size_c;
    try { _data = new T[siz]; /* thrown here */ }
    catch (...) {
      _width = _height = _depth = _spectrum = 0; _data = 0;
      throw CImgInstanceException(_cimg_instance
                                  "CImg(): Failed to allocate memory (%s) for image (%u,%u,%u,%u).",
                                  cimg_instance,
                                  cimg::strbuffersize(sizeof(T)*size_x*size_y*size_z*size_c),
                                  size_x, size_y, size_z, size_c);
    }
    fill(value);
  } else { _width = _height = _depth = _spectrum = 0; _data = 0; }
}
image.width() and image.height() in the constructor call are around 25000 and 900, respectively. That makes siz somewhere in the vicinity of 66 million, so that's about 66 MB worth of unsigned chars being allocated.
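Just to isolate it, the allocation boils down to something like this standalone sketch (using the rounded dimensions above, not my real code):

#include <iostream>
#include <new>

int main() {
    // Rounded dimensions from the failing call: ~25000 x ~900, depth 1, 3 channels,
    // which gives roughly the siz described above.
    const unsigned long siz = 25000UL * 900UL * 1UL * 3UL;
    try {
        unsigned char* data = new unsigned char[siz]; // one byte per element, same as new T[siz] above
        std::cout << "allocated " << siz << " bytes\n";
        delete[] data;
    } catch (const std::bad_alloc&) {
        // new[] reports failure with std::bad_alloc, which is what CImg's
        // catch (...) turns into the CImgInstanceException.
        std::cout << "failed to allocate " << siz << " bytes\n";
    }
    return 0;
}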
Googling has given me a bunch of results that suggest memory fragmentation. At its peak, the program uses about 2 GB. Surely Windows can find a spot for 66 MB in the remaining >6 GB of memory, and this can't be memory fragmentation, right? That said, what else could it be?
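If fragmentation of the process's address space really is the suspect, my understanding is that something like the following Win32 sketch (walking the address space with VirtualQuery; not part of my program) would show the largest contiguous free block:

#include <windows.h>
#include <iostream>

int main() {
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    const char* p   = static_cast<const char*>(si.lpMinimumApplicationAddress);
    const char* end = static_cast<const char*>(si.lpMaximumApplicationAddress);

    MEMORY_BASIC_INFORMATION mbi;
    SIZE_T largestFree = 0;

    // Walk the address space region by region, tracking the largest block
    // that is entirely free; a heavily fragmented space would show a small
    // value here even if the total free memory is large.
    while (p < end && VirtualQuery(p, &mbi, sizeof(mbi)) != 0) {
        if (mbi.State == MEM_FREE && mbi.RegionSize > largestFree)
            largestFree = mbi.RegionSize;
        p = static_cast<const char*>(mbi.BaseAddress) + mbi.RegionSize;
    }

    std::cout << "largest free region: " << largestFree << " bytes\n";
    return 0;
}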
I'll add that this only happens when the program is compiled in debug mode, not in release mode.