I have come across a bizarre problem and I have created this simple program in order to demonstrate it. I know the code itself doesn't make much sense, but I would like to highlight something I don't understand.

#include <iostream>
using namespace std;

class tempClass{
public:
    float n;
    tempClass() {}
    ~tempClass(){} //VERY IMPORTANT LINE
    void* operator new[](size_t size);
};


void* tempClass::operator new[](size_t size){
    tempClass* a;
    a = ::new tempClass[2];
    for (int i = 0; i < 2; i++)
    {
        a[i].n = i*10;
    }
    cout << a << endl;
    return a;
}


int main(){
    tempClass* a;
    a = new tempClass[2];
    cout << a << endl;
    cout << a[0].n << endl;
    return 0;
}

In the code, I have overloaded `operator new[]` for the class I created. However, the behaviour changes depending on whether or not I include the destructor of the class. I have noticed that if I don't include the destructor, everything works fine, whereas if I do, the pointer value `a` receives in `main` is always 8 greater than the one printed inside `operator new[]`. As a result, in this example, the last `cout` of the program will always print 20 if I include the destructor and 0 if I don't. Why does this happen?

codingEnthusiast
  • Not the issue, but why are you not using `size` in your `operator new[]`? – NathanOliver Jul 25 '17 at 17:16
  • As I said in the question, the code doesn't make sense at all. I just wanted to write a dummy example to highlight the problem. After all, this is not the only problem the code has and I am aware of that, but it is easy to highlight the problem through it. – codingEnthusiast Jul 25 '17 at 17:18
  • Do you mean the value of `a[0].n` or the pointer `a`? For the latter I would expect much bigger values. And do you repeat the `new` call within the `main` function, or do you start the whole program several times with different outcomes? – bjhend Jul 25 '17 at 17:22
  • Yes, this is indeed a duplicate, but not a homework question. – codingEnthusiast Jul 25 '17 at 17:27

2 Answers

Array-new-expressions pass an unspecified amount of overhead to the allocation function (i.e. your `operator new[]` overload). This is to allow the implementation to record the number of array elements, so that destructors can be called on deletion.

If your implementation detects that the class doesn't need destructors to be called (because it is trivially destructible), it may choose to not require the same amount of overhead as it would otherwise.

The formal wording is 8.3.4[expr.new]p11:

When a new-expression calls an allocation function and that allocation has not been extended, the new-expression passes the amount of space requested to the allocation function as the first argument of type std::size_t. That argument shall be no less than the size of the object being created; it may be greater than the size of the object being created only if the object is an array. [...]

Note that the overhead is unspecified, so it could in principle be different at every call! (This is also the reason that placement-array-new is unusable.) However, the Itanium ABI that many implementations use is fairly specific in how the "array cookie" works, and matches your experience.

Kerrek SB
This is the Itanium CXX ABI array operator new[] cookie:

When operator new is used to create a new array, a cookie is usually stored to remember the allocated length (number of array elements) so that it can be deallocated correctly. [...]
No cookie is required if the array element type T has a trivial destructor (12.4 [class.dtor]) [...]

A destructor, other than one defined within the class as = default, is non-trivial.

ecatmur