36

In C++, how can I know whether a pointer has already been deleted?

When I tried to delete a pointer that had already been deleted in another part of the code, it threw an exception that couldn't be handled.

I was wondering if there is a way to check the pointer first, or to attempt the delete safely. Any references about advanced memory operations would be welcome.

I also want to get a better grip on unhandled pointer exceptions and access violations (accessing protected memory), and errors of that kind.

Thanks to everyone who gives some of their knowledge and time to help other people.


Update

The big advice from much of the modern C++ developer community is: use smart pointers, or try to avoid raw pointers altogether. For exception safety and to make sure memory gets freed (see the ISO C++ FAQ), and of course if you want to avoid the small overhead of smart pointers (it may not always be noticeable, but it exists), you can write your own custom methods that deal with raw pointers [type*], but this is not a general solution. Always prefer smart pointers to raw pointers. A minimal sketch of such a hand-rolled wrapper is shown after the next paragraph.

At GoingNative 2013, a common piece of advice was: never use raw pointers.
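
As a concrete illustration of the "custom methods around a raw pointer" idea mentioned above, here is a minimal sketch (the class name ScopedPtr is made up for this example; in practice you would simply use std::unique_ptr):

#include <stdexcept>

template <typename T>
class ScopedPtr
{
public:
    explicit ScopedPtr(T* p) : ptr_(p) {}
    ~ScopedPtr() { delete ptr_; }             // freed exactly once, even during stack unwinding
    T* get() const { return ptr_; }
    T& operator*() const { return *ptr_; }
    ScopedPtr(const ScopedPtr&) = delete;     // no copies, so no double delete
    ScopedPtr& operator=(const ScopedPtr&) = delete;
private:
    T* ptr_;
};

void work()
{
    ScopedPtr<int> guard(new int(5));
    throw std::runtime_error("oops");         // guard still releases the int while unwinding
}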

I'm Geeker
ahmedsafan86
  • Use smart pointers (`std::shared_ptr` and `std::weak_ptr`) – Andy Prowl Mar 31 '13 at 15:09
  • Thanks very much for the advice, but will this solve the problem at its root? Will no pointer-related problems appear if I use smart pointers? – ahmedsafan86 Mar 31 '13 at 15:11
  • a) You don't delete pointers, you delete *objects*. b) Using invalid pointers doesn't throw exceptions; rather, it is *undefined behaviour*. – Kerrek SB Mar 31 '13 at 15:18
  • OK, I delete the object with (delete ptr;), where the pointer points at the object in memory. But when I try to access a member through the pointer after it was deleted, crashes happen; trying to delete an already deleted one also crashes, and when debugging, Visual Studio reports an unhandled exception. – ahmedsafan86 Mar 31 '13 at 15:27

7 Answers

36

There can be three solutions. You might want to choose one depending on the effort/quality ratio you want to achieve:

Elegant and most correct solution:

Use smart pointers, and you never have to manually call delete again. This is the best possible way to overcome this problem. It uses the principle of RAII, which works perfectly for a language like C++ that does not have a built-in garbage collector.
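
For example, a minimal sketch of what that looks like in practice (assuming C++11 or later; Widget is just a placeholder type for this example):

#include <memory>

struct Widget { int value = 0; };

void use_widget()
{
    std::unique_ptr<Widget> ptr(new Widget);   // ptr owns the Widget
    ptr->value = 42;
}   // delete happens here automatically, exactly once, even if an exception is thrown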

Less elegant but workable solution:

Assign the pointer to NULL after deletion. Calling delete on a NULL pointer is a no-op, so it removes the need for an extra NULL check, but it might hide some problems instead of making them visible.
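
A short sketch of that pattern (nullptr assumes C++11; with older compilers NULL behaves the same way here):

void example()
{
    int* p = new int(5);
    delete p;        // the object is gone
    p = nullptr;     // record that p no longer owns anything

    delete p;        // deleting a null pointer is a no-op, so this cannot crash
    if (p != nullptr)
    {
        // never entered once p has been reset, so no use-after-free here
    }
}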

Less elegant but more correct solution:

Hunt down all the multiple-delete problems by letting your program crash. You can also use memory analysis tools like Valgrind, and then fix your code to avoid all these problems.
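
For illustration, a deliberately broken program and the usual way of pointing Valgrind at it (exact output varies by version):

// double_delete.cpp - intentionally wrong, so the checker has something to report
int main()
{
    int* p = new int(1);
    delete p;
    delete p;    // second delete: Valgrind reports an "invalid free"
    return 0;
}

// Typical invocation on Linux:
//   g++ -g double_delete.cpp -o double_delete
//   valgrind ./double_delete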

Alok Save
  • None of the solutions you present is really general. Smart pointers only work in special situations. Setting the pointer you delete to null doesn't help much, since it doesn't affect other pointers, and hunting down the problem after the fact is very, very difficult. The only real solution is design, up front, so that the lifetime of the (very few) dynamically allocated objects is determined. – James Kanze Mar 31 '13 at 15:25
  • @JamesKanze: It's true that a well-designed application will rarely face such problems, and one should always design to strictly determine and limit object lifetimes. But as true as that is, the other bitter truth is that properly designed applications only exist in an ideal world, while most of us have to work with applications which already have these problems. You don't create them but inherit them. – Alok Save Mar 31 '13 at 15:30
  • If the application isn't properly designed, it won't work, regardless of what you do after the fact. – James Kanze Mar 31 '13 at 15:39
6

This is a good question, but one of the fundamental truths of working in a manually memory-managed environment (like C/C++ and its cousins) is that there's no good way of looking at a pointer after the fact and asking whether it's valid; once it's become invalid, it's gone, and looking at it is prone to blowing up. Your job is to make sure that it's never deleted or freed more than once, and never accessed after that time.

Definitely look at the smart pointers, which were invented to make programmers' lives easier in just these circumstances. (The more traditional method is to be careful, not screw it up, and then maybe assign NULL to the pointer when you know it's been deleted, as Alok says.)

Ben Zotto
  • This isn't unique to C++. Globally, Java doesn't give you an automatic solution either; its garbage collection does allow you to implement one, but you still have to implement it yourself. (And of course, if you're doing anything critical in C++, you'll be using garbage collection as well, precisely so that you can reliably detect when you're accessing an already deleted object.) – James Kanze Mar 31 '13 at 15:39
4

In C++ How to decide or know if a pointer was deleted before??

The language standard does not offer any legal way to determine whether an arbitrary pointer is valid or not.

There's one way, but it's highly compiler/OS-specific. You can either hook into the existing memory manager or replace it with your own, and provide a dedicated function for pointer validation. It may not be very easy to do, though. And you don't really want to rely on this functionality if performance is critical.
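
A very rough sketch of that idea, only to show the mechanism (the names g_live and is_valid_pointer are invented for this example; the table is fixed-size, not thread-safe, and not suitable for real use):

#include <cstdlib>
#include <cstdio>
#include <new>

static void* g_live[4096];       // addresses currently owned by operator new
static std::size_t g_count = 0;

void* operator new(std::size_t size)
{
    void* p = std::malloc(size);
    if (!p) throw std::bad_alloc();
    if (g_count < 4096) g_live[g_count++] = p;   // remember the allocation
    return p;
}

void operator delete(void* p) noexcept
{
    for (std::size_t i = 0; i < g_count; ++i)
        if (g_live[i] == p) { g_live[i] = g_live[--g_count]; break; }
    std::free(p);
}

bool is_valid_pointer(const void* p)             // the "dedicated validation function"
{
    for (std::size_t i = 0; i < g_count; ++i)
        if (g_live[i] == p) return true;
    return false;
}

int main()
{
    int* p = new int(1);
    std::printf("%d\n", (int)is_valid_pointer(p));   // 1
    delete p;
    std::printf("%d\n", (int)is_valid_pointer(p));   // 0
}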

Alexey Frunze
3

Use shared_ptr<> and shared_array<>. Remember that shared_ptr<> can be used to manage memory allocated for an array only if an appropriate deleter is provided; otherwise use shared_array<> to manage your arrays:

#include <boost/shared_ptr.hpp>

A* a_tab = new A[100];
boost::shared_ptr<A> a_tab_ok(a_tab, ArrayDeleter<A>());

This is only correct if an appropriate deleter is provided, for example:

template <typename T>
class ArrayDeleter
{
public:
    void operator () (T* d) const
    {
        delete [] d;   // will delete the whole array!
    }
};

4pie0
  • One could think that shared_ptr<> by default is also valid for arrays: be careful, though. – 4pie0 Mar 31 '13 at 15:32
2

The pointer won't tell you anything. Your design should: if you're using dynamic allocation, it's normally because your application requires the object to have a specific lifetime, so you know when to correctly delete the object. If the object is copyable, or has a lifetime which corresponds to scope, you don't (normally) allocate it dynamically.

There are, of course, exceptions in very low level code—if you're implementing something like std::vector, you will have to use some sort of dynamic allocation, because the size isn't known at compile time. But such allocations shouldn't escape; it's the responsibility of the low level class to handle the memory.

Finally, buffer overruns, accessing already deleted memory, and the like are undefined behavior. They do not, in general, result in an exception, and there's not a generic way of handling them. (You can usually arrange to get a signal when such things occur, but there are so few things you can do from a signal handler, this doesn't really help much.) In general, what you want is for the program to crash, since you don't know what state it is in. In the rare cases where this is not the case, you have to fall back on implementation-defined extensions, if they exist. If you compile with the /EHa option with VC++, for example, what would normally be a crash will be converted into a C++ exception. But that's a VC++ extension, and you still don't know the overall state of the program when this occurs. If it's because you've corrupted the free space arena, there's probably not much you can do even if you catch the exception (and there's a good chance you'll get another exception from a destructor trying to free memory when you unwind the stack).
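
A sketch of that VC++ extension only (not portable C++, and the program state after the catch is still suspect); with /EHa, a hardware access violation can reach catch(...):

// Compile with MSVC:  cl /EHa example.cpp
#include <iostream>

int main()
{
    try {
        volatile int* p = nullptr;
        std::cout << *p << '\n';    // access violation (undefined behaviour)
    } catch (...) {
        std::cout << "structured exception translated to a C++ exception\n";
    }
    // Without /EHa (or on other compilers) this simply crashes.
}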

James Kanze
2

I know this thread is old, but if someone else is reading this, they should know about unique_ptr. shared_ptr does indeed have overhead: the counter is stored on the heap, and every time the counter is accessed there is a risk of a processor cache miss. unique_ptr is more limited but has no overhead compared to plain pointers. My suggestion is to prefer unique_ptr over shared_ptr when you do not need reference counting. Another important note is that unique_ptr works well with arrays. If I remember correctly, this is also true for shared_ptr since C++17.
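
A small sketch of those points (std::make_unique needs C++14; the shared_ptr<T[]> lines need C++17):

#include <memory>

int main()
{
    std::unique_ptr<int>   one = std::make_unique<int>(42);     // single object, no reference count
    std::unique_ptr<int[]> arr = std::make_unique<int[]>(100);  // array form calls delete[]
    arr[0] = 1;

    std::shared_ptr<int[]> shared_arr(new int[100]);            // C++17: array form also uses delete[]
    shared_arr[0] = 2;                                          // operator[] on shared_ptr<T[]> is C++17 too

    return 0;   // everything is released automatically, no manual delete anywhere
}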

Martin Fehrs
1

Smart pointers are a better choice to avoid such problems (but you must understand them completely before using them). I would, however, like to mention a performance limitation associated with smart pointers: they usually use atomic operations, for example InterlockedIncrement in the Win32 API, for reference counting. These functions are significantly slower than plain integer arithmetic. I am not sure whether such a small performance penalty is acceptable in your case or not.

What I usually do (so I don't have to spend days later debugging nasty bugs) is spend a lot of time on design and object lifetimes before moving on to actual coding, and when I delete memory I specifically set the pointer to NULL; that is good practice as far as I'm concerned. Again, perhaps the real solution is to spend more time determining dependencies and object lifetimes before moving on.

Saqlain
  • Smart pointers don't necessarily help, and they can even aggravate the situation. – James Kanze Mar 31 '13 at 15:37
  • @Saqlain: InterlockedIncrement is an atomic operation and it doesn't have a noticeable effect on performance; if it did, just using plain pointers at the low level would do. But what makes you always stick with smart pointers is the exception-safety guarantee (memory is freed even if an exception is thrown). – ahmedsafan86 Dec 02 '13 at 21:07