My simulation tries to predict the demand on a system over a long period of time ... the output is a very large 4D array (I use the 4 dimensions to minimise the chance of an error when the data is written to the array, i.e. I can understand it better this way!).
The array size will be 25x4x3x20000 and the elements need to be at least unsigned int, but I know that the stack can't handle this amount of data: 25 x 4 x 3 x 20000 four-byte elements is roughly 24 MB, far beyond a typical 1-8 MB default stack.

unsigned int ar[25][4][3][20000];
I have been looking around and found several solutions, but I am still undecided about which one to implement. So my question is: which one is better in terms of performance and good practice?
- Use a vector of arrays, as described in stackoverflow.com/questions/18991765. But then, any idea on how to extend this to four dimensions? (See the first sketch after this list.)
std::vector< std::array<int, 5> > vecs;
vecs.reserve(N);
- Use a 4D vector (a vector of vectors of vectors of vectors) and push_back(): I didn't use this because I know the final size of the array and wanted to avoid many push_back operations, though the second sketch below shows that such a vector can also be sized entirely at construction.
- Create the array on the heap, as described in stackoverflow.com/questions/675817 (third sketch below).
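A minimal sketch of how the vector-of-arrays idea might extend to four dimensions, assuming the large (20000) dimension can be moved to the front so that the vector carries it on the heap while the three small dimensions stay fixed-size:

#include <array>
#include <vector>

// One "slice" bundles the three small, fixed dimensions (25 x 4 x 3).
using Slice = std::array<std::array<std::array<unsigned int, 3>, 4>, 25>;

int main() {
    // The vector supplies the large dimension: all 20000 slices are
    // heap-allocated, contiguous, and zero-initialised.
    std::vector<Slice> ar(20000);
    ar[19999][24][3][2] = 42;   // note the index order is now [t][i][j][k]
}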
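The second sketch, for completeness: a nested 4D vector can be sized entirely in its constructor, so no push_back (and no reallocation) ever happens; the trade-off is that the data ends up scattered across many separate heap allocations:

#include <vector>

int main() {
    // Every level is sized up front, so push_back is never needed.
    std::vector<std::vector<std::vector<std::vector<unsigned int>>>> ar(
        25, std::vector<std::vector<std::vector<unsigned int>>>(
                4, std::vector<std::vector<unsigned int>>(
                       3, std::vector<unsigned int>(20000, 0))));
    ar[24][3][2][19999] = 42;   // same index order as the plain array
}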
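The third sketch puts the exact array type on the heap in one contiguous block while keeping the original [25][4][3][20000] indexing; the wrapper struct (the name Demand is just illustrative) exists so that std::make_unique can allocate, zero-initialise, and later free it automatically:

#include <memory>

// Wrapping the raw array in a struct lets a smart pointer manage it.
struct Demand {
    unsigned int ar[25][4][3][20000];
};

int main() {
    // The ~24 MB block lives on the heap instead of the stack.
    auto d = std::make_unique<Demand>();
    d->ar[24][3][2][19999] = 42;
}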
Any other suggestion is appreciated!