I have a double array of 15,000,000 elements; at run time, random subsets of 2,000 elements need to be extracted from it for processing.

I've tried initialising the array using the following:

static const double myArray[15000000] = {-2.1232, -6.4243, 23.432, ...};

However, during runtime I get the error "C1060 compiler is out of heap space". In Visual Studio 2019, I've gone into Project Properties -> Linker -> System and modified the Heap Reserve Size to "8000000000", which I assumed would be large enough, and I have 16 GB on my machine, but I still get the same error. I've also tried using the x64 compiler, but to no avail.

I've also tried writing the array to a CSV, and then a binary file, and reading from that instead during runtime. However, the read process takes far too long, as I'm required to read from it, ideally, several times a second.

I'm relatively new to C++, but especially new when it comes to memory allocation. What would you suggest as a solution?

Frank
  • Why not load the array from a file? – seleciii44 May 19 '20 at 08:01
  • Please include a [mcve]. – 463035818_is_not_an_ai May 19 '20 at 08:03
  • Why are you hard-coding a staggering number of values like this? Typically this is read from a file into a `std::vector` at run-time. The cost should be inconsequential. – tadman May 19 '20 at 08:03
  • That sounds like the compiler ran out of memory while trying to compile the application. Likely has something to do with the gigantic array. – rid May 19 '20 at 08:03
  • Why do you need to read from a file several times per second? Just load it once, at startup. Reading the data from an external file shouldn't take longer than it takes to load it from the executable. – molbdnilo May 19 '20 at 08:17
  • @seleciii44 I've tried that, but the read process takes too long, and the array still needs to be initialized. – Frank May 19 '20 at 08:18
  • @molbdnilo Wouldn't the array still require initialization in order to read the values into it? I guess the crux of the problem is: how do I store this array in memory even if I'm reading it from a file? – Frank May 19 '20 at 08:20
  • Use a pointer instead of an array. Allocate the array dynamically. Load a binary file into it. – molbdnilo May 19 '20 at 08:21
  • Arrays guarantee a contiguous memory layout; it may be that your heap doesn't have a large enough contiguous block free. – yaodav May 19 '20 at 08:59

2 Answers


If you have your 15M doubles in binary format, you can embed that blob into your executable and reference it directly. The run-time cost is just a bit more disk IO when the executable is first loaded, and that should be much faster than parsing a CSV.

Botje

The problem may be that you have enough memory in total, but not as one contiguous block. So my suggestion is to use `std::list`.

yaodav
  • An `std::list` of doubles on a 64-bit platform takes up at least three times the size of an array or vector. The linked list is the most overrated data structure and is very rarely a good choice. – molbdnilo May 19 '20 at 10:57
  • Yes, but it can use fragmented memory, so it can make use of more of the heap. – yaodav May 19 '20 at 10:59
  • It is also horribly, horribly slow. And the issue for the poster is not heap fragmentation at runtime, it's that the compiler runs out of memory. – molbdnilo May 19 '20 at 11:00
  • You're right about the run time, but he also said that he increased the amount of memory and the error continued. And `sizeof(double)` and `sizeof(double*)` are both 8, so changing the elements to pointers doesn't reduce the size of the list. – yaodav May 19 '20 at 11:09
  • He increased *the compiler's* heap size - not his own program's - and the difference between `sizeof(double[15000000])` (which is 120000000) and `sizeof(double*)` (which is 4 or 8) is huge. – molbdnilo May 19 '20 at 11:13
  • Then I don't understand your suggestion: he needs the data to be accessible at run time, relatively fast, and needs 2,000 values from that array per pass, so how does `double*` instead of `double[15000000]` help him? – yaodav May 19 '20 at 11:23
  • It helps because a) the compiler does not need to handle all that memory during compilation and linking, so the program will compile, and b) accessing a dynamically allocated array is as fast as accessing a static one. – molbdnilo May 19 '20 at 11:41
  • So you're saying to do this? `double* arr = new double[15000000];` – yaodav May 19 '20 at 12:00