8

I have a small C++ program that requires a large 2d array.

int distanceArray[282][9900000];

I am aware that a standard 32-bit console application would be unable to allocate this much memory to my process due to the 2GB cap on 32-bit applications. However, even though I have configured my compiler to use the native amd64 toolset, I still get the same error when I try to build my application:

Error   C2148   total size of array must not exceed 0x7fffffff bytes

I have 16GB of RAM on my system, so I know that my physical memory should not be an issue. If my calculations are correct, this should take up around 11GB (282 × 9,900,000 × 4-byte ints). However, I don't understand why I can't seem to get the 64-bit compiler to work correctly. I have followed the directions on Microsoft's website that outline how to use the 64-bit compiler, but no matter what I do, I receive the same error. Any help would be greatly appreciated. Thank you!

I am aware that this question has been asked before, but the existing answers have not been helpful to me. Thank you for any attempt at helping me find my problem.

  • 1
    You're probably overflowing your stack. That's not the same as the total available memory of a process. Use a `std::vector<std::vector<int>>` or something instead. – πάντα ῥεῖ Feb 11 '17 at 06:30
  • 1
    Wouldn't that usually cause an error at runtime, though? I can't even get this to compile properly. – Mitchell Augustin Feb 11 '17 at 06:32
  • That seems to be a compiler specific thing then. I can't reproduce [here](http://coliru.stacked-crooked.com/a/4bfd4b70ce80f332). You probably should add tags and explanation about your specific development environment. – πάντα ῥεῖ Feb 11 '17 at 06:35
  • Either way, if you did get it running in this state, it would most certainly overflow the stack extremely badly. – chris Feb 11 '17 at 06:40
  • [Here's why](http://stackoverflow.com/a/40272921/17034), use new[] to allocate it. – Hans Passant Feb 11 '17 at 07:26

2 Answers

7

The 64-bit PE/COFF executable format used on Windows doesn't support creating executables that have a load size of greater than 2GB, so you can't create statically allocated objects bigger than that. You run into a similar problem if you try to create such an object on the stack as an automatically allocated object.

One possible solution would be to dynamically allocate the object:

int (*distanceArray)[9900000] = (int (*)[9900000]) calloc(282, sizeof(int[9900000]));

Or, if you want it to be more C++'ish and don't need the zero-initialization that a statically allocated array would give you:

int (*distanceArray)[9900000] = new int[282][9900000];
Ross Ridge
  • 38,414
  • 7
  • 81
  • 112
  • 2
    Ouch, why not suggest a `std::vector<std::vector<int>>`. It's tagged C++ after all. – MSalters Feb 11 '17 at 12:29
  • 1
    @MSalters Because my example is almost exactly the same thing as `int distanceArray[282][9900000];` except allocated dynamically and so can be used as a "drop-in" replacement in most contexts. If you want to make it more C++'ish then you could use `new[]` instead, otherwise not knowing from the question whether a vector of vectors would be acceptable alternative I went the most direct replacement. Also using a vector of vectors had been already suggested by πάντα ῥεῖ, so I'd only be repeating something that I had presumed the original poster was already aware of. – Ross Ridge Feb 11 '17 at 19:25
  • 1
    There is also one more 'thing' defining the largest possible array. You can not have an array where indexes are bigger than `INT_MAX` (see `<climits>`). Dynamic or static. In C or C++. –  Sep 16 '18 at 15:52
3

As suggested by MSalters, a `std::vector<std::vector<int>>` was definitely the way to go.

For anyone who is still having this problem, here is how I initialized it:

std::vector<std::vector<int>> distanceArray(282, std::vector<int>(9000000, -1));

This creates 282 rows of 9,000,000 columns each, with every value initialized to -1 at the start.

Thanks to everyone who commented for the help!