
Use example in link, but change to use char * and vector:

#include <vector>
using namespace std;

class Test{
    char *myArray;

public:
    Test(){
        myArray = new char[10];
    }

    ~Test(){
        delete[] myArray;
    }   
};  


int main(){
    vector<Test> q; // line 1
    Test t;         // line 2
    q.push_back(t);
}

It causes a double free or corruption error. However, if line 2 runs before line 1, like:

Test t;
vector<Test> q;

Then it runs ok. Why is that?

Tested on Xubuntu 12.04 g++ 4.6.3.

Updated:

It's not a duplicate question. I understand that a copy constructor and an assignment operator are needed (that is already answered in the link above, where the sample code is from). However, using int * or queue as in the original link and swapping line 1 and line 2, there is still an error. Only using char * and vector and swapping line 1 and line 2 avoids the error. My question is: why this particular case? Can anyone check it on your platform?

user2847598

5 Answers


Your type manages resources (a dynamically allocated array) but does not implement the rule of three. When you do this:

q.push_back(t);

q makes a copy of t, which it owns. So now you have two objects referring to the same data, each attempting to call delete[] on it.

You need to implement a copy constructor and an assignment operator. Or use a class that manages its own resources, such as std::string or std::vector.

Calling delete[] on an already deleted array is undefined behaviour (UB). This means that sometimes your program might seem to work. You cannot rely on a program with undefined behaviour to do anything. Swapping lines 1 and 2 inverts the order in which t and q get destroyed. This seems to yield a different result on your platform, but both are UB.

juanchopanza

C++ automatically makes a shallow copy of your Test object when you push it to the vector. When the vector goes out of scope and is destructed, the myArray pointer is delete[]d. Then, when Test goes out of scope, the same pointer is delete[]d again.

You should define a copy constructor that makes a deep copy of the object, allocating a new array for the copy.

Overloading assignment operator (explained in the same link as above) is also strongly suggested.

Dariusz
  • The only thing that is strongly suggested is using `std::vector` or `std::unique_ptr` instead of implementing everything yourself. The latter requires more effort for no reason and leads to error-prone and duplicate code. –  Jan 03 '14 at 13:06

Rule of three (http://en.wikipedia.org/wiki/Rule_of_three_%28C%2B%2B_programming%29). There is likely an assignment going on that you don't see between two Test objects, so the old pointer from one object is being naively assigned to the new object.
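A minimal sketch of what the Rule of Three asks for here, assuming the Test class from the question (destructor, copy constructor, and copy assignment all defined together):

```cpp
#include <algorithm>  // std::copy

class Test {
    char *myArray;
public:
    Test() : myArray(new char[10]) {}

    // copy constructor: allocate a fresh buffer, then copy the contents
    Test(const Test& other) : myArray(new char[10]) {
        std::copy(other.myArray, other.myArray + 10, myArray);
    }

    // copy assignment: guard against self-assignment, then copy
    Test& operator=(const Test& other) {
        if (this != &other)
            std::copy(other.myArray, other.myArray + 10, myArray);
        return *this;
    }

    ~Test() { delete[] myArray; }
};
```

With all three members defined, each Test owns its own buffer, so the hidden copy inside push_back no longer leads to a double delete[].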

Joe

Because you need a copy constructor.

push_back copies the argument, and since you haven't provided a copy constructor, the compiler-generated default is used, which makes a shallow copy (it copies just the pointer, not the content *)

So, you need to define a

Test( const Test& other )
{
    myArray = new char[10];
    for ( int i = 0; i < 10; ++i )      // deep copy the buffer
        myArray[i] = other.myArray[i];
}
Test& operator=( const Test& other ) // just in case
{
    if ( this != &other )               // guard against self-assignment
        for ( int i = 0; i < 10; ++i )
            myArray[i] = other.myArray[i];
    return *this;
}

and make a deep copy, manually copying the char* buffer.

*) which leads to double deletion - once from the t's destructor, once from the q's destructor (which calls the destructors of all elements in the vector)

Kiril Kirov

This is because the Test objects are automatically created and deleted when managed by the vector and when they are inserted by push_back. When new elements are added to the vector, more space may be needed, so storage is reallocated and the existing elements are copied to new addresses; the old copies are then destroyed and their dynamic memory deallocated. To overcome this, define a copy constructor for your class that makes a deep copy of the object. Or use smart pointers such as unique_ptr or shared_ptr (from boost or C++11).