3

Are pointers to things that are allocated in other ways reasonably safe in C++?

Up to this point, I've been using STL containers (and in one case, an array, but that's another question) for all my dynamic memory needs, so I hadn't needed to explicitly use the new keyword. I've also been blithely using plain ol' int *foo type pointers to reference things. Now I'm reading about smart pointers (I cut my teeth on Java, so I never had to worry about this before) and the conventional wisdom seems to be "bare pointers are bad, don't use them."

So how much trouble am I in? Can I safely keep using bare pointers, so long as the things they point to have other destruction conditions? Is it something I can get away with, but should avoid in the future? Or is it a disaster in the making that I should go fix post-haste?

  • You should read up on exception safety and RAII, and look at two of the new smart pointers in C++11 (which have been available in Boost for a while): `std::shared_ptr` and `std::unique_ptr`. There are a lot of other similar questions, like: http://stackoverflow.com/questions/6675651/when-should-i-use-c-pointers-over-smart-pointers. In particular, exception safety is worth learning about: how to prevent problems like memory leaks when exceptions are thrown and the stack is unwound. – wkl May 14 '12 at 19:27

5 Answers

6

Bare pointers are safe per se; it is incorrect usage of them that is dangerous (and it is easy to get carried away). Smart pointers are nifty and all, but some (`shared_ptr`) involve reference counting, which incurs a performance penalty. You should try to use smart pointers where applicable, but AFAIK using bare pointers is not considered a horrible mistake.

You should be careful when keeping pointers to elements of STL containers, as their addresses can change when the container reallocates, leaving you with strange bugs.
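
To illustrate that last point, here is a minimal sketch (not from the answer) of how a pointer into a `std::vector` can silently become dangling when the vector grows:

#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    v.push_back(1);

    int* p = &v[0];                  // pointer into the vector's storage

    for (int i = 0; i < 1000; ++i)   // growing may reallocate the buffer,
        v.push_back(i);              // which invalidates p

    // Dereferencing p here would be undefined behaviour if a reallocation
    // happened, because p no longer points into v's current storage.
    std::cout << "old element address:     " << static_cast<const void*>(p) << "\n"
              << "current element address: " << static_cast<const void*>(&v[0]) << "\n";
}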

  • [unique_ptr](http://en.cppreference.com/w/cpp/memory/unique_ptr) and [scoped_ptr](http://www.boost.org/doc/libs/1_49_0/libs/smart_ptr/scoped_ptr.htm) have no performance overhead, and are necessary to avoid memory leaks in the face of exceptions. – Robert Cooper May 14 '12 at 19:30
  • Re: the last sentence in your answer - Daaaang. That sounds like something I would almost certainly have run into and _never_ figured out on my own. – Obliterax Scourge of Nations May 14 '12 at 20:51
  • @ObliteraxScourgeofNations: it is safe to take pointers into `array` and into the node-based containers `list`, `map`, and `set`. The other containers may invalidate all pointers when you add (or reserve space for) elements. – Mooing Duck May 14 '12 at 22:43
  • @MooingDuck, it also happens when you remove items. In a vector, unless you remove the last item, all the items after the one you just removed are moved "down" one notch. – Alexis Wilke Nov 21 '18 at 03:15
  • @AlexisWilke: That's not what the specification says. When you remove an item, all pointers are invalidated. That means the implementation could allocate a whole new array and move all of the items, rather than "moving them down one notch". – Mooing Duck Nov 21 '18 at 18:10
  • @MooingDuck, to move down one notch is probably the usual implementation, though. But that was not really my point. In your previous comment you only mentioned "add" and not "remove". Any modification of the container may end up invalidating the iterators. – Alexis Wilke Nov 22 '18 at 06:03
1

In a perfect world, where the people who write the code and the people who maintain it never make any mistakes, raw pointers are amazing.

Unfortunately, that's not the case. Bare pointers are error prone: they can point to memory that has been invalidated without the pointer knowing about it, they can be aliased, and the contents they point to can be changed out from under you.

We actually need smart pointers to make up for our "stupidity". At least something has got to be "smart" :).

Unless you're working on something very much under the hood, there's no need to use raw pointers, simply because they're "not so smart". That being said, if you're very careful, and the people who use your code after you write it are very careful (which, more often than not, is not the case), then go ahead and use raw pointers; otherwise, use smart pointers, as they incur little or no overhead.

`unique_ptr<>` has no overhead whatsoever until you move it, in which case it writes one null pointer into memory. On modern compilers this is frequently optimized out.

`shared_ptr<>` counts references and can incur a considerable amount of overhead, particularly when used in multi-threaded apps, but this can be worked around, so it is not a big dealbreaker.
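
As a rough sketch of the difference between the two (the `Widget` type here is invented for illustration):

#include <iostream>
#include <memory>

struct Widget {
    ~Widget() { std::cout << "Widget destroyed\n"; }
};

int main() {
    // unique_ptr: sole owner, no reference count. Ownership is transferred
    // with std::move; the moved-from pointer becomes null.
    std::unique_ptr<Widget> a(new Widget);
    std::unique_ptr<Widget> b = std::move(a);
    std::cout << "a is " << (a ? "non-null" : "null") << '\n';

    // shared_ptr: shared ownership via a reference count. The Widget is
    // destroyed only when the last copy goes away.
    std::shared_ptr<Widget> s1 = std::make_shared<Widget>();
    std::shared_ptr<Widget> s2 = s1;
    std::cout << "use_count: " << s1.use_count() << '\n';   // prints 2
}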

All in all, there's no need to urgently fix the raw pointers, but I think their usage is discouraged.

ScarletAmaranth
  • I would disagree with the 'perfect world' scenario: raw pointers don't obey RAII and aren't inherently exception safe, even in a perfect world. – 111111 May 14 '12 at 19:36
  • No upvote, because naked pointers are good for pointing to resources you don't own. (Well, you could use `boost::optional`, but why would you?) – Mooing Duck May 14 '12 at 22:49
  • Also pointing to things you don't own can bring some super fun times on occasion ;) – ScarletAmaranth May 14 '12 at 22:51
1

It is entirely accurate to say "Bare pointers are bad; don't use them" with a small addendum: " to point to things you have to clean up".

If you have an object and it is somebody else's responsibility to destroy it, then a raw pointer is absolutely fine. However, the moment you are responsible for destroying an object through any cleanup function, always use a smart pointer. In addition, for objects you do not clean up yourself, be aware of the conditions under which they are cleaned up by another system (function locals going out of scope, vector resizes, etc.).

Rules of ownership:

  • No ownership: `T*`, and be aware of when you can no longer use it
  • Shared ownership: `shared_ptr<T>`; use a custom deleter if necessary
  • Unique ownership: `unique_ptr<T, Del>`; custom deleter if necessary

Always follow these rules and you will never have any memory leaks, double frees, bad pointer accesses, or any similar memory-related bugs.
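
A small sketch of how those three rules might look in code; the `Texture` type and the `create_texture`/`destroy_texture` functions are made up to stand in for some C-style API:

#include <cstdio>
#include <memory>

struct Texture { int id; };

// Hypothetical C-style API, used here only to show custom deleters.
Texture* create_texture() { return new Texture{42}; }
void destroy_texture(Texture* t) { delete t; }

void observe(const Texture* t) {          // no ownership: plain T*
    std::printf("observing texture %d\n", t->id);
}

int main() {
    // Unique ownership with a custom deleter.
    std::unique_ptr<Texture, void (*)(Texture*)>
        tex(create_texture(), &destroy_texture);

    // Shared ownership: both copies keep the object alive.
    std::shared_ptr<Texture> shared(create_texture(), &destroy_texture);
    std::shared_ptr<Texture> another = shared;

    // Non-owning access: valid only while some owner keeps the object alive.
    observe(tex.get());
    observe(shared.get());
}   // the unique_ptr and the last shared_ptr call destroy_texture here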

Puppy
  • There's surely a bit more to the rules than that! What about [leaking via unnamed `shared_ptr` temporaries](http://www.boost.org/doc/libs/release/libs/smart_ptr/shared_ptr.htm#BestPractices), or [`shared_ptr` cycles](http://www.boost.org/doc/libs/release/libs/smart_ptr/shared_ptr.htm#Introduction) to name a couple. – Fraser May 14 '12 at 23:11
0

Bare pointers are considered bad because it is easy to get into trouble with them. Smart pointers handle some of the issues for you automatically, which makes them less error prone.

When you have absolute control of all the code (i.e., you are the only coder on the project), using bare pointers is fine as long as you follow the basic memory allocation laws and customs ("whoever allocates the memory gets rid of it, except when noted otherwise"). But when you work with other people on the code (i.e., projects with more than one coder), that opens the door to mistakes and misunderstandings.

Smart pointers keep track of who owns an object (and therefore who should deallocate it). When allocated data is shared, they can also track when the last piece of code using the object no longer needs it, and can then safely deallocate it.

Reference-counting smart pointers also give you safe default copy constructors and copy assignment operators for class data members that allocate memory from the heap. When an object is copied, the clone's smart pointer simply points at the same allocation as the original's, and when either the original or the clone goes out of scope or is deleted, the managed memory is kept around for whichever object still points at it. This isn't true with bare pointers: if a class owns memory through a bare pointer, you have to write the copy constructor and copy assignment operator yourself to clone the allocated data and prevent corruption (or a double free) of the contained data.
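
A hedged sketch of that contrast; the `SharedBuffer` and `RawBuffer` class names are invented for illustration:

#include <memory>

// With a reference-counted member, the compiler-generated copy operations
// are safe: both copies share the same allocation, and it is freed exactly
// once, when the last owner goes away.
struct SharedBuffer {
    std::shared_ptr<int> data{new int(0)};
    // the implicit copy constructor and copy assignment are fine
};

// With a bare owning pointer, the default copy operations just copy the
// pointer, so two objects end up deleting the same memory. You would have
// to write the copy constructor, copy assignment, and destructor yourself
// (the "rule of three").
struct RawBuffer {
    int* data = new int(0);
    ~RawBuffer() { delete data; }
    // default copies would lead to a double delete:
    // RawBuffer a; RawBuffer b = a;   // both delete the same int
};

int main() {
    SharedBuffer a;
    SharedBuffer b = a;   // safe: the reference count goes to 2
    (void)b;
}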

StarPilot
  • Smart pointers that reference count will yield safe default copy constructors, but 99% of the time that won't do what you want it to do, and they will act similarly to raw pointers apart from avoiding the double delete. – Mooing Duck May 14 '12 at 22:53
-2

Bare pointers are bad when you `new` and `delete` without worrying too much about it; that behaviour can lead to very strange errors. What I suggest, when you have to work with a couple of pointers and heap-allocated objects, is to learn to use the memory checker Valgrind.
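
For instance, a deliberately leaky snippet like the following (invented for illustration) is the kind of thing Valgrind will flag as "definitely lost" memory when you run the program under `valgrind --leak-check=full`:

// leak.cpp -- build with, e.g., g++ -g leak.cpp, then run:
//   valgrind --leak-check=full ./a.out
int main() {
    int* p = new int[10];   // allocated with new[] ...
    p[0] = 7;
    return 0;               // ... but never released with delete[]:
                            // Valgrind reports the block as "definitely lost"
}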

Then some simple rules: when you instantiate arrays with `new[]`, you always have to delete them with `delete[]`; conversely, when you instantiate single objects with `new`, always call `delete`.

Remember to avoid mixing `new`/`delete` with `malloc`/`free` or with `new[]`/`delete[]`, because these functions are not designed to work with each other. For example, never do this:

int *a = (int*)malloc(10*sizeof(int));
delete a;      // wrong: memory from malloc must be released with free(a)

but do this:

int *a = new int[10];
delete[] a;    // arrays allocated with new[] are released with delete[]

As Tibor said, the use of pointers is not bad per se but, as always, "with great power comes great responsibility". :P

linello
  • Never instantiate anything with `new[]`, and never call a destruction function yourself. – Puppy May 14 '12 at 19:36
  • Why the downvote? I agree with using smart pointers, but these are just useful hints for a "bare pointer" user. – linello May 14 '12 at 19:40
  • I did not downvote, but smart pointers are necessary for exception-safe code. If any function between `new` and `delete` throws an exception, your program will never call `delete` and will leak memory. – Robert Cooper May 14 '12 at 19:48
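
To make that last point concrete, a minimal sketch (the `work` function is invented): if `work()` throws, the raw-pointer version leaks, while the `unique_ptr` version is cleaned up during stack unwinding.

#include <memory>
#include <stdexcept>

void work() { throw std::runtime_error("boom"); }   // may throw

void leaky() {
    int* p = new int(42);
    work();          // if this throws, the next line never runs
    delete p;        // skipped on exception: the memory is leaked
}

void safe() {
    std::unique_ptr<int> p(new int(42));
    work();          // if this throws, p's destructor still runs
}                    // during stack unwinding, so there is no leak

int main() {
    try { leaky(); } catch (const std::exception&) {}
    try { safe();  } catch (const std::exception&) {}
}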