
I'm new to C++ and I'm studying compressive sensing, so I need to work with huge matrices. MATLAB was too slow, so I implemented my algorithm in C++.

The thing is that I store big arrays (around 100 MB to 1 GB each), about 20 of them. Everything works fine while the process uses up to 30 GB of memory, but when it needs more than 40 GB it just stops. I think it's a memory problem. I tested it on Linux and Windows (64-bit OS, 64-bit MinGW compilers, 200 GB RAM, Intel Xeon). Is there any limitation?

size_t tm = n * m * l;
double *x = new double[tm];

I use around 20 arrays like this one; n, m ≈ 1000 and l ≈ 30 are typical sizes, so tm is about 3×10^7 elements, i.e. roughly 240 MB per array.
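For what it's worth, wrapping one such allocation like this (a sketch; allocate_block is just an illustrative name) would show whether the stop is actually a std::bad_alloc:

#include <cstddef>
#include <iostream>
#include <new>

// Diagnostic sketch: allocate one array the same way and report whether
// the failure is a std::bad_alloc from operator new.
double *allocate_block(std::size_t n, std::size_t m, std::size_t l)
{
    std::size_t tm = n * m * l;   // e.g. 1000 * 1000 * 30 = 3e7 elements (~240 MB)
    try {
        return new double[tm];
    } catch (const std::bad_alloc &) {
        std::cerr << "new double[" << tm << "] threw std::bad_alloc\n";
        return nullptr;
    }
}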

Thank you

Tay2510
  • You might have 200 GB total memory, but you're asking for 40 GB of *contiguous* memory? I'm not surprised you can't do that. Break up your allocation into multiple pieces (see the sketch after these comments). – Greg Hewgill May 12 '15 at 02:55
  • *"when the process needs more than 40Gb it just stops"* - so you're getting a `std::bad_alloc` exception? You could try doing the largest allocations as soon as early as possible - that might reduce fragmentation. Are your memory usage measurements based on the requests in the source code, or as seen in some utility like top or Task Manager? Because looking in a utility may give insights into whether there's a memory leak.... – Tony Delroy May 12 '15 at 03:09
  • @GregHewgill: But it's only contiguous *virtual* memory, though. Surely the OS will break up the mapping to physical memory however it sees fit? – Oliver Charlesworth May 12 '15 at 06:23
  • Any chance that you have a memory leak in your program? – Peter Petrik May 12 '15 at 07:30
  • How do you not have 40 GB of contiguous memory in an address space 16 exabytes big?! That's about 16 billion gigabytes; you would need a minimum of 400 million interspersed blocks to fragment it. – MSalters May 12 '15 at 08:00
  • Why does everyone assume it's 40 GB of contiguous memory? Each array is independently contiguous. – SomeWittyUsername May 12 '15 at 08:38
  • Even if you do assume 40 GB of contiguous address space, this should not be an issue on a 64-bit machine, nor should physical RAM be an issue given that you want 40 GiB and the machine has 200 GiB. Surely, neither fragmentation nor physical memory can plausibly explain why this fails. – Damon May 12 '15 at 09:10
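Following Greg Hewgill's suggestion, here is a minimal sketch (illustrative names) of breaking one n*m*l array into l separately allocated slabs, so that no single allocation is large:

#include <cstddef>
#include <vector>

// Sketch: store the n*m*l elements as l slabs of n*m doubles instead of
// one contiguous block.
std::vector<std::vector<double>> make_chunked(std::size_t n, std::size_t m, std::size_t l)
{
    std::vector<std::vector<double>> slabs(l);
    for (std::size_t k = 0; k < l; ++k)
        slabs[k].assign(n * m, 0.0);   // each slab: n*m doubles (~8 MB at n = m = 1000)
    return slabs;
}

// An element that used to live at x[(k*n + i)*m + j] becomes slabs[k][i*m + j].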

1 Answer


20 arrays and a problem at 40 GB of total memory use - that suggests the program breaks when a single array exceeds 2 GB. This should not happen: a 64-bit toolchain should use a 64-bit size_t for object sizes. It appears that MinGW incorrectly uses a 31-bit size (i.e. a 32-bit size that loses the sign bit as well).
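If that diagnosis is right, it is easy to test: a single allocation just past 2^31 bytes should fail even though the machine has plenty of RAM. A minimal sketch:

#include <cstddef>
#include <iostream>
#include <new>

// Test of the 31-bit-size hypothesis: 2^28 doubles is exactly 2 GB, so
// one element more pushes the request just past the suspected limit.
int main()
{
    const std::size_t count = (std::size_t{1} << 28) + 1;
    try {
        double *x = new double[count];
        std::cout << "allocated " << count * sizeof(double) << " bytes\n";
        delete[] x;
    } catch (const std::bad_alloc &) {
        std::cout << "std::bad_alloc at " << count * sizeof(double) << " bytes\n";
    }
}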

I don't know how you allocate memory, but this is perhaps fixable by bypassing the broken allocation routine and going straight to the OS allocator. On Windows, for example, you could call VirtualAlloc (skip HeapAlloc; it's not designed for such large allocations).
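A minimal sketch of that route on Windows (error handling omitted; the wrapper names are illustrative):

#include <cstddef>
#include <windows.h>

// Reserve and commit page-aligned memory directly from the OS, bypassing
// the C++ runtime's allocator.
double *os_alloc(std::size_t count)
{
    void *p = VirtualAlloc(nullptr, count * sizeof(double),
                           MEM_RESERVE | MEM_COMMIT, PAGE_READWRITE);
    return static_cast<double *>(p);   // nullptr on failure
}

void os_free(double *p)
{
    if (p) VirtualFree(p, 0, MEM_RELEASE);
}

On Linux, mmap with MAP_ANONYMOUS plays the same role.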

MSalters