Depending on how the OS and (perhaps) the program respond to low memory situations, wasting memory might make the program slow or unstable. No guarantees.
On a modern OS a large but mostly-unused dynamic array of int should have little or no impact. Most of the unused part of the array will only ever be assigned virtual address space; it will never be backed by RAM or swap. 64-bit OSes (which you must be using if you're talking about 32 GB of RAM) won't run short of virtual address space until you use up 2^48 bytes with these things.
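For instance (a minimal sketch; the exact behavior depends on the OS and its overcommit settings, and the size below is just an example), allocating a huge array and touching only a few elements costs almost nothing in physical memory:

    #include <cstddef>
    #include <iostream>

    int main() {
        const std::size_t N = 1000000000; // example size only: ~4 GB worth of int

        // The allocation reserves virtual address space; on most modern OSes
        // physical pages are only faulted in when they are actually written.
        int* data = new int[N];

        // Touch only the first few elements: only a handful of pages end up
        // backed by RAM, the rest of the array stays purely virtual.
        for (std::size_t i = 0; i < 10; ++i)
            data[i] = static_cast<int>(i);

        std::cout << data[5] << '\n';
        delete[] data;
    }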
"The reasoning behind this is to avoid using the vector class, due to the big performance hit in intense applications."
There likely is a big performance hit for creating a vector<int> larger than you need, since its elements will be value-initialized (zeroed), whereas this array is left uninitialized. If that's what you mean, then your code should cause no more instability than a huge vector, and possibly less, because the memory is never touched.
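To illustrate the difference (a rough sketch; the function name is arbitrary):

    #include <cstddef>
    #include <memory>
    #include <vector>

    void compare(std::size_t n) {
        // vector<int> v(n) value-initializes every element to zero,
        // i.e. it writes n * sizeof(int) bytes up front.
        std::vector<int> v(n);

        // new int[n] (no parentheses) default-initializes: the ints are left
        // uninitialized and the allocation itself touches no memory.
        std::unique_ptr<int[]> a(new int[n]);
    }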
If it weren't for that, there would be no big performance hit from vector with optimization enabled. So you could, for example, work around it by using a vector of struct UninitializedInt { int value; UninitializedInt() {} }; to ensure a no-op default constructor. You might like to add an int constructor and/or an operator int() to make users' lives easier (to avoid typing .value all over the place), although that leads to ambiguous arithmetic operators, so it's not a slam dunk.
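Something along these lines (a sketch of the wrapper, including the optional constructor and conversion operator mentioned above):

    #include <vector>

    // Wrapper whose default constructor deliberately does nothing, so
    // std::vector's element construction becomes a no-op.
    struct UninitializedInt {
        int value;
        UninitializedInt() {}                    // no-op: value stays uninitialized
        UninitializedInt(int v) : value(v) {}    // convenience constructor
        operator int() const { return value; }   // implicit conversion back to int
    };

    int main() {
        // Elements are default-constructed, but the constructor does nothing,
        // so the element storage is never written during construction.
        std::vector<UninitializedInt> v(1000000);

        v[42] = 7;       // uses the int constructor plus assignment
        int x = v[42];   // uses operator int()
        (void)x;
    }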
Or perhaps you could use reserve() to allocate space for the vector and then resize(), push_back(), or insert() as needed. If that ends up with you in effect checking the bounds or modifying the size on every access, then you would just replace one performance hit with another, of course.
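A sketch of that approach (assuming you roughly know the final size up front):

    #include <cstddef>
    #include <vector>

    std::vector<int> build(std::size_t expected) {
        std::vector<int> v;
        v.reserve(expected);   // allocates capacity but constructs no elements

        // Elements are constructed only as they are actually added, so nothing
        // is initialized ahead of time; the cost is that push_back checks
        // capacity (and updates the size) on every call.
        for (std::size_t i = 0; i < expected; ++i)
            v.push_back(static_cast<int>(i));

        return v;
    }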
Your code should work, though. Provided you don't need to re-implement too much of the interface of vector, it might be the lowest-hassle way to eliminate that initialization overhead. Of course, you need to make sure you free it properly. For example:
std::unique_ptr<int[]> array_(new int[N]);
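(Note that std::make_unique<int[]>(N) would value-initialize the array, i.e. zero it, reintroducing exactly the overhead you're trying to avoid; plain new as shown above, or C++20's std::make_unique_for_overwrite, leaves the elements uninitialized.)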