19

I know that manual dynamic memory allocation is a bad idea in general, but is it sometimes a better solution than using, say, std::vector?

To give a crude example, say I had to store an array of n integers, where n <= 16. I could implement it using

int* data = new int[n]; //assuming n is set beforehand

or using a vector:

std::vector<int> data;

Is it absolutely always a better idea to use a std::vector or could there be practical situations where manually allocating the dynamic memory would be a better idea, to increase efficiency?

Vivek Ghaisas
  • 961
  • 1
  • 9
  • 24
  • You don't have to `push_back`. `std::vector(n)` is *almost* equivalent to your dynamic array version, except that the `n` integers are value-initialized (hence zero-initialized) in the vector. – juanchopanza Mar 08 '13 at 12:58
  • @juanchopanza: Fair point. I removed the `push_back` part. It wasn't supposed to be part of the comparison. – Vivek Ghaisas Mar 08 '13 at 13:31

8 Answers

16

It is always better to use std::vector/std::array, at least until you can conclusively prove (through profiling) that the T* a = new T[100]; solution is considerably faster in your specific situation. This is unlikely to happen: vector/array is an extremely thin layer around a plain old array. There is some overhead to bounds checking with vector::at, but you can circumvent that by using operator[].
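As a rough illustration of that last point (the variable names here are just for the example), at() is bounds-checked and throws, while operator[] is not:

#include <iostream>
#include <stdexcept>
#include <vector>

int main() {
    std::vector<int> v(16, 0);   // 16 zero-initialized ints, one allocation

    v[3] = 42;       // operator[]: no bounds check, same cost as raw array access
    v.at(3) = 42;    // at(): bounds-checked, throws std::out_of_range on a bad index

    try {
        v.at(100) = 1;           // out of range: throws instead of corrupting memory
    } catch (const std::out_of_range& e) {
        std::cout << "caught: " << e.what() << '\n';
    }
}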

us2012
  • 16,083
  • 3
  • 46
  • 62
  • 4
    The usual reason for using C style arrays has nothing to do with speed; it's for static initialization, and for the compiler to determine the size according to the number of initializers. (Which, of course, never applies to dynamically allocated arrays). – James Kanze Mar 08 '13 at 13:04
  • @James If I'm reading your comment correctly, you are objecting to the fact that I seem to be bashing C-style arrays without saying that I mean dynamically allocated ones? If so, I have edited my answer regarding this. (Also, +1 to your answer.) – us2012 Mar 08 '13 at 13:07
  • 1
    That clears it up. I didn't know that `vector`/`array` is a thin layer. I kinda assumed that with all the functionality, it must have a significant overhead. – Vivek Ghaisas Mar 08 '13 at 13:40
  • You said "It is always...until...solution is considerably faster". I didn't read it as being restricted to dynamic allocation. (As I said in my answer, I have _never_ used an array `new`. Before `std::vector` and `std::string`, the first thing one did was to write something equivalent.) But while I never use array `new`, there are cases where C style arrays are justified (some, but not all of which can be replaced by `std::array` in C++11). – James Kanze Mar 08 '13 at 14:25
10

I can't think of any case where dynamically allocating a C style array makes sense. (I've been working in C++ for over 25 years, and I've yet to use new[].) Usually, if I know the size up front, I'll use something like:

std::vector<int> data( n );

to get an already sized vector, rather than using push_back.

Of course, if n is very small and is known at compile time, I'll use std::array (if I have access to C++11), or even a C style array, and just create the object on the stack, with no dynamic allocation. (Such cases seem to be rare in the code I work on; small fixed size arrays tend to be members of classes, where I do occasionally use a C style array.)
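A minimal sketch of the two patterns described above; the names make_data and Sample are made up purely for illustration:

#include <array>
#include <cstddef>
#include <vector>

// Runtime size known up front: construct the vector already sized,
// so there is a single allocation and no push_back loop.
std::vector<int> make_data(std::size_t n) {
    return std::vector<int>(n);    // n value-initialized ints
}

// Small size fixed at compile time: a std::array member lives inside
// the object itself, with no dynamic allocation at all.
struct Sample {
    std::array<int, 16> values{};  // C++11; a plain "int values[16]" would also work
};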

James Kanze
  • 150,581
  • 18
  • 184
  • 329
4

If you know the size in advance (especially at compile time), and don't need the dynamic re-sizing abilities of std::vector, then using something simpler is fine.

However, that something should preferably be std::array if you have C++11, or something like boost::scoped_array otherwise.

I doubt there'll be much efficiency gain unless it significantly reduces code size or something, but it's more expressive, which is worthwhile anyway.
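For illustration, a rough sketch of both alternatives, assuming Boost is available for the pre-C++11 case (the function names are made up):

#include <array>                    // C++11
#include <cstddef>
#include <boost/scoped_array.hpp>   // pre-C++11 alternative

void cxx11_style() {
    std::array<int, 16> data{};     // fixed size, no dynamic allocation
    data[0] = 1;
}

void boost_style(std::size_t n) {
    boost::scoped_array<int> data(new int[n]);  // delete[]'d automatically
    data[0] = 1;
}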

Useless
  • 64,155
  • 6
  • 88
  • 132
4

You should try to avoid C-style arrays in C++ whenever possible. The STL provides containers which suffice for almost every need. Just imagine reallocating an array or deleting elements out of its middle: the container shields you from handling this, while you would have to take care of it yourself, and if you haven't done it a hundred times it is quite error-prone.
An exception, of course, is if you are addressing low-level issues which STL containers might not be able to cope with.

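As a hedged sketch of the point about deleting elements out of the middle, the standard erase-remove idiom does the shifting and size bookkeeping for you (the function name is illustrative):

#include <algorithm>
#include <vector>

void remove_threes(std::vector<int>& v) {
    // The container moves the tail elements and adjusts its size;
    // with a raw new[] array you would have to shift the tail and
    // track the logical length yourself.
    v.erase(std::remove(v.begin(), v.end(), 3), v.end());
}
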
There has already been some discussion about this topic. See here on SO.

Community
  • 1
  • 1
bash.d
  • 13,029
  • 3
  • 29
  • 42
  • +1 for the link at the end, that should destroy once and for all the myth that accessing vector elements is somehow slow. – us2012 Mar 08 '13 at 12:43
3

Is it absolutely always a better idea to use a std::vector or could there be practical situations where manually allocating the dynamic memory would be a better idea, to increase efficiency?

Call me a simpleton, but 99.9999...% of the time I would just use a standard container. The default choice should be std::vector, but std::deque<> could also be a reasonable option sometimes. If the size is known at compile time, opt for std::array<>, which is a lightweight, safe wrapper around C-style arrays that introduces zero overhead.

Standard containers expose member functions to specify the initial reserved amount of memory, so you won't have trouble with reallocations, and you won't have to remember to delete[] your array. I honestly do not see why one should use manual memory management.

Efficiency shouldn't be an issue, since you have throwing and non-throwing member functions to access the contained elements, so you can choose whether to favor safety or performance.
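A small sketch of what specifying the initial reserved amount looks like in practice (the function name and sizes are arbitrary):

#include <cstddef>
#include <vector>

std::vector<int> build(std::size_t n) {
    std::vector<int> v;
    v.reserve(n);               // one allocation up front, no reallocation below
    for (std::size_t i = 0; i < n; ++i) {
        v.push_back(static_cast<int>(i));  // size grows, capacity stays at n
    }
    return v;                   // no delete[] to remember; the vector cleans up
}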

Andy Prowl
  • 124,023
  • 23
  • 387
  • 451
2

std::vector can be constructed with a size_type parameter that creates the vector with the specified number of elements in a single dynamic allocation (the same as your array), and you can also use reserve to decrease the number of reallocations over the lifetime of the container.

Zlatomir
  • 6,964
  • 3
  • 26
  • 32
1

If n is known at compile-time, then you should choose std::array as:

std::array<int, n> data; //n is compile-time constant

and if n is not known at compile-time, OR the array might grow at runtime, then go for std::vector:

std::vector<int> data(n); //n may be known at runtime 

Or in some cases, you may also prefer std::deque, which is faster than std::vector in some scenarios, for example frequent insertions at the front (see the sketch below).

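A rough sketch of one such scenario (the function name is made up):

#include <deque>

// push_front is constant time for std::deque; with std::vector every
// front insertion would shift all existing elements.
std::deque<int> reversed_fill(int n) {
    std::deque<int> d;
    for (int i = 0; i < n; ++i)
        d.push_front(i);        // results in n-1, ..., 1, 0
    return d;
}
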
Hope that helps.

Nawaz
  • 353,942
  • 115
  • 666
  • 851
  • Unless you know that `n` is very, very small, you probably shouldn't declare local variables as `std::array`. Unless there is some very specific reason for doing otherwise, I'd just use `std::vector`---if I know the size, I'll initialize the vector with the correct size. (This also supposes that the type has a default constructor.) – James Kanze Mar 08 '13 at 12:52
0

From the perspective of someone who often works with low-level code in C++, std vectors are really just helper methods with a safety net around a classic C-style array. The only overheads you'd realistically experience are memory allocations and safety checks on boundaries. If you're writing a program which needs performance and you are going to be using vectors as a regular array, I'd recommend just using C-style arrays instead of vectors. You should realistically be vetting the data that comes into the application and checking the boundaries yourself, to avoid checks on every memory access to the array.

It's good to see that others are checking the differences between the C ways and the C++ ways. More often than not, C++ standard methods have significantly worse performance and uglier syntax than their C counterparts, which is generally the reason people call C++ bloated. I think C++ focuses more on safety and on making the language more like JavaScript/C#, even though it fundamentally lacks the foundation to be one.

Epic Speedy
  • 636
  • 1
  • 11
  • 25