2

Someone on IRC claimed that, although allocating with `new[]` and deallocating with `delete` (not `delete[]`) is UB, it would be safe on Linux platforms (no further details were given about the OS).

Is this true? Is it guaranteed? Is it to do with something in POSIX that specifies that dynamically-allocated blocks should not have metadata at the start?

Or is it just completely untrue?


Yes, I know I shouldn't do it. I never would.
I am curious about the veracity of this idea; that's it!


By "safe", I mean: "will not cause behaviour other than were the original allocation performed by new, or were the de-allocation performed by delete[]". This means that we might see 1 "element" destruction or n, but no crashing.

Lightness Races in Orbit
  • 2
    I doubt this is ever safe for types with a non-trivial destructor, and even if it is that would depend more on the compiler than the OS, I think. – Sven Jan 20 '12 at 10:41
  • I don't think POSIX deals with C++ at all. – Cat Plus Plus Jan 20 '12 at 10:42
  • What does `safe` mean to you? Safe as in: "the memory was freed" or safe as in: "**all** the destructors were executed". – Matthieu M. Jan 20 '12 at 10:44
  • @MatthieuM.: The former, as if `new` had been used instead (I guess), with none of the typical symptoms of UB code – Lightness Races in Orbit Jan 20 '12 at 10:45
  • If there is such a situation where it is safe, it is fully dependent on the `malloc()` and `free()` implementation - the OS has nothing to do with the userspace heap. – Griwes Jan 20 '12 at 10:45
  • 3
    How `new[]` and `delete` expressions are implemented depends on the compiler/std lib, not on the OS. I don't know how GCC deals with it, but I _very_ seriously doubt `delete new std::string[10]` would call ten dtors. – sbi Jan 20 '12 at 10:46
  • @Griwes: This really doesn't depend on `malloc` in any way. `free` never calls C++ destructors. – Cat Plus Plus Jan 20 '12 at 10:47
  • @CatPlusPlus, well, kinda right, but it depends on definition of "safe", again. – Griwes Jan 20 '12 at 10:50
  • 1
    Any definition of "safe" which excludes "respects invariants and postconditions set by the core language" isn't worth having. – spraff Jan 20 '12 at 10:58
  • 5
    FWIW, here's how safe it looks on ideone, even with a single destructor to call: http://ideone.com/ZlPDM. I get the same behaviour on my Linux box. – R. Martinho Fernandes Jan 20 '12 at 10:59
  • 2
    [This article might be of interest](http://web.archive.org/web/20080703153358/http://taossa.com/index.php/2007/01/03/attacking-delete-and-delete-in-c) – Jesse Good Jan 20 '12 at 13:10
  • @Jesse: Ah, absolutely! Thanks. – Lightness Races in Orbit Jan 20 '12 at 13:15
  • possible duplicate of [Is delete\[\] equal to delete?](http://stackoverflow.com/questions/1553382/is-delete-equal-to-delete) – sbi Feb 16 '12 at 10:20

5 Answers

13

Of course it's not true. That person is mixing up several different concerns:

  • how the OS handles allocations and deallocations
  • correct calls to constructors and destructors
  • UB means UB

On the first point, I'm sure he's correct. It is common to handle both in the same way on that level: it is simply a request for X bytes, or a request to release the allocation starting at address X. It doesn't really matter if it's an array or not.

On the second point, everything falls apart. `new[]` calls the constructor for each element in the allocated array. `delete` calls the destructor for the single object at the specified address. And so, if you allocate an array of objects and free it with `delete`, only one element will have its destructor invoked. (This is easy to forget, because people invariably test this with arrays of ints, in which case the difference is unnoticeable.)
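
A minimal counting sketch makes this asymmetry concrete (the mismatched delete is left as a comment, since actually running it is undefined behaviour and typically aborts on glibc):

#include <iostream>

struct Tracked {
  static int ctors, dtors;
  Tracked()  { ++ctors; }
  ~Tracked() { ++dtors; }
};
int Tracked::ctors = 0;
int Tracked::dtors = 0;

int main() {
  Tracked* p = new Tracked[5];   // runs 5 constructors
  delete[] p;                    // runs 5 destructors
  // With `delete p;` instead, at most one destructor would run before the
  // mismatched deallocation - on glibc that typically aborts the program.
  std::cout << Tracked::ctors << " constructors, "
            << Tracked::dtors << " destructors\n";   // prints "5 constructors, 5 destructors"
}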

And then there's the third point, the catch-all. It's UB, and that means it's UB. The compiler may make optimizations based on the assumption that your code does not exhibit any undefined behavior. If your code does exhibit it, those assumptions are violated, and seemingly unrelated code might break.

spraff
jalf
7

Even if it happens to be safe in some environment, don't do it. There's no reason to want to do it.

Even if it did return the right memory to the OS, the destructors wouldn't be called properly.

It's definitely not true for all or even most Linuxes; your IRC friend is talking bollocks.

POSIX has nothing to do with C++. In general, this is unsafe. If it works anywhere, it's because of the compiler and library, not the OS.

spraff
  • Ok, on *my* Linux distro (Kubuntu), it definitely is not safe. – spraff Jan 20 '12 at 10:41
  • I guess it fully depends on the detailed `malloc()` implementation; so, probably, it has nothing to do with the OS, but depends on the stdlib implementation. Then, using a specific library implementation, it could be safe on every OS. But don't do it. It's silly and doesn't make any sense. – Griwes Jan 20 '12 at 10:42
  • Best example is using `delete` instead of `delete[]` and leaking memory (or other resources) because of dtors not being run. I doubt that any implementation will "fix" that. – PlasmaHH Jan 20 '12 at 10:53
  • @Griwes: `malloc()` is not the guilty party. In both cases of `new` and `new[]` it just provides some bare memory. However, in one case the operator also has to store information about how many destructors to call and where to find the objects on which to call them. – LiKao Jan 20 '12 at 11:21
  • @LiKao, trueness of your sentence depends on definition of "safe". – Griwes Jan 20 '12 at 12:01
  • @Griwes: I wasn't talking about safety in my sentence. I was correcting your sentence. There is nothing in my sentence which depends on any definition of safety at all. – LiKao Jan 20 '12 at 12:34
  • @LiKao, but the OP was talking about safety. And I was talking about "is not the guilty party". – Griwes Jan 20 '12 at 12:52
  • @Griwes: Sure, I got that. But `malloc()` or the cstdlib isn't in any way responsible for anything having to do with the difference between `new[]` or `new` and `delete[]` or `delete`. There is some more magic after the call to malloc involved, usually by adding some additional information at the beginning of the block and then returning an offset pointer. The correct offset has to be subtracted again before calling `free` from within `delete` or `delete[]`. This offset is calculated differently within `delete` and `delete[]`. Hence in one case `free()` just gets passed an invalid pointer. – LiKao Jan 20 '12 at 14:18
  • @Griwes: Ok, just looked up the standard for `free()`. Funny enough, I always thought it had to crash when the pointer passed to it was not malloced (or previously freed). Seems like this is "only" UB, so there may be implementations which just discard such "invalid" pointers. I guess then it depends both on the implementation of `malloc()`/`free()` as well as `new[]`/`delete[]`. – LiKao Jan 20 '12 at 14:32
3

This question discusses in great detail when exactly mixing new[] and delete looks safe (no observable problems) on Visual C++. I suppose that by "on Linux" you actually mean "with gcc", and I've observed very similar results with gcc on ideone.com.

Please note that this requires:

  1. the global operator new() and operator new[]() functions to be implemented identically, and
  2. the compiler to optimize away the "prepend with number of elements" allocation overhead,

and also only works for types with trivial destructors.
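
For example, with a trivially-destructible element type there are no destructors to run and, on a typical implementation (e.g. GCC following the Itanium C++ ABI), no "number of elements" cookie is stored either, so the following tends to appear to work - even though it is still undefined behaviour:

int main() {
  int* p = new int[10];   // trivially destructible: no per-element destructors,
                          // and typically no array cookie in front of the block
  delete p;               // still undefined behaviour, even if nothing visibly breaks
  return 0;
}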

Even with these requirements met there's no guarantee it will work on a specific version of a specific compiler. You'll be much better off simply not doing that - relying on undefined behavior is a very bad idea.

sharptooth
2

It is definitely not safe, as you can verify with the following code:

#include <iostream>

class test {
public:
  test(){ std::cout << "Constructor" << std::endl; }
  ~test(){ std::cout << "Destructor" << std::endl; }
};

int main() {
  test * t = new test[ 10 ];
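  // UB: t points to an array allocated with new[], so this should be delete[] t;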
  delete t;
  return 1;
}

Have a look at http://ideone.com/b8BiQ . It fails miserably.

It may work when you use only fundamental types rather than classes, but even that is not guaranteed.

EDIT: Some explanations for those of you who want to know why this crashes:

new and delete mainly serve as wrappers around malloc(), hence calling free() on a newed pointer is "safe" most of the time (remember to call the destructor first), but you should not rely on it. For new[] and delete[], however, the situation is more complicated.

When an array of classes gets constructed using new[], each default constructor will be called in turn. When you do delete[], each destructor gets called. However, each destructor also has to be supplied a this pointer to use inside as a hidden parameter. So before calling the destructors, the program has to find the locations of all objects within the reserved memory, in order to pass these locations as this pointers to the destructors. So whatever information is needed to later reconstruct these locations has to be stored somewhere.

Now the easiest way would be to have a global map somewhere, which stores this information for all new[]ed pointers. In this case, if delete is called instead of delete[], only one of the destructors would be called and the entry would not be removed from the map. However, this method is usually not used, because maps are slow and memory management should be as fast as possible.

Hence, for GCC/libstdc++, a different solution is used. Since only a few bytes are needed as additional information, it is fastest to just over-allocate by these few bytes, store the information at the beginning of the memory, and return the pointer to the memory after the bookkeeping. So if you allocate an array of 10 objects of 10 bytes each, the program will allocate 100+X bytes, where X is the size of the data needed to reconstruct the this pointers later.

So in this case it looks something like this

| Bookkeeping | First Object | Second Object |....
^             ^
|             This is what is returned by new[]
|
this is what is returned by malloc()

So in case you pass the pointer you have received from new[] to delete[], it will call all destructors, then subtract X from the pointer and give that one to free(). However, if you call delete instead, it will call the destructor for the first object and then immediately pass that pointer to free(), which means free() has just been passed a pointer which was never malloced, which means the result is UB.

Have a look at http://ideone.com/tIiMw to see what gets passed to delete and delete[]. As you can see, the pointer returned from new[] is not the pointer which was allocated inside; instead, 4 is added to it before it is returned to main(). When calling delete[] correctly, the same four is subtracted and we get the correct pointer within delete[]. However, this subtraction is missing when calling delete, and we get the wrong pointer.
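
Here is a minimal sketch in the same spirit, replacing the global array allocation functions so that the pointers become visible (the exact offset and output are implementation details; this assumes GCC/libstdc++ on Linux):

#include <cstdio>
#include <cstdlib>
#include <new>

void* operator new[](std::size_t n) {
  void* p = std::malloc(n);
  if (!p) throw std::bad_alloc();
  std::printf("operator new[](%zu) allocated %p\n", n, p);
  return p;
}

void operator delete[](void* p) noexcept {
  std::printf("operator delete[] received   %p\n", p);
  std::free(p);
}

struct test {
  ~test() {}   // non-trivial destructor forces the array cookie
};

int main() {
  test* t = new test[10];
  std::printf("new test[10] returned        %p\n", static_cast<void*>(t));
  // The returned pointer is offset past the bookkeeping. A plain `delete t;`
  // would hand that offset pointer straight to operator delete / free(),
  // which is where the heap corruption comes from.
  delete[] t;   // subtracts the offset again, so free() gets the original block
}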

In case of calling new[] on a fundamental type, the compiler immediately knows that it will not have to call any destructors later and it just optimizes the bookkeeping away. However it is definitely allowed to write bookkeeping even for fundamental types. And it is also allowed to add bookkeeping in case you call new.

This bookkeeping in front of the real pointer is actually a very good trick, in case you ever need to write your own memory allocation routines as a replacement for new and delete. There is hardly any limit on what you can store there, so one should never assume that anything returned from new or new[] was actually returned from malloc().
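
As a small sketch of that trick (tracked_alloc and tracked_free are made-up names, just for illustration), a header is stored in front of the pointer the caller sees:

#include <cstddef>
#include <cstdlib>
#include <new>

// Bookkeeping header, over-aligned so the pointer we hand out keeps the
// alignment that malloc() guarantees.
struct alignas(alignof(std::max_align_t)) Header {
  std::size_t size;   // anything you need later can live here
};

void* tracked_alloc(std::size_t n) {
  void* raw = std::malloc(sizeof(Header) + n);
  if (!raw) throw std::bad_alloc();
  static_cast<Header*>(raw)->size = n;              // record the bookkeeping
  return static_cast<char*>(raw) + sizeof(Header);  // caller never sees the header
}

void tracked_free(void* p) {
  if (!p) return;
  char* raw = static_cast<char*>(p) - sizeof(Header);  // step back to the real block
  std::free(raw);
}

int main() {
  void* p = tracked_alloc(100);
  tracked_free(p);   // passing p straight to free() would be exactly the mismatch above
}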

LiKao
  • That's not a proof of "safeness". Admittedly I didn't give a strict definition of "safe". – Lightness Races in Orbit Jan 20 '12 at 12:10
  • @LightnessRacesinOrbit: I never said it was a proof of safeness... It isn't. It's a proof of unsafeness. In this case not all destructors get called and a horrible glibc memory corruption error is produced (at least at ideone and my machine). This means it is unsafe. – LiKao Jan 20 '12 at 12:29
  • Er, I meant "unsafeness". And, sorry, I missed the SIGABRT completely. I take back my comment (though I still don't care how many destructors calls are made). – Lightness Races in Orbit Jan 20 '12 at 12:33
  • @LightnessRacesinOrbit: why not? So you're saying that it doesn't have to *work* in order to be "safe"? – jalf Jan 20 '12 at 12:38
  • @jalf: Essentially. This question's definition of safe -- though admittedly unclear -- was "will not cause behaviour other than were the original allocation a `new` _or_ the de-allocation a `delete[]`". – Lightness Races in Orbit Jan 20 '12 at 12:47
  • @LightnessRacesinOrbit: and the correct number of destructor calls is part of that behavior. Mixing them will change the number of destructor calls, even if everything else works. So by your definition, changing the number of destructor calls means it is not safe – jalf Jan 20 '12 at 13:03
  • @jalf: Nah, not so. Changing the number of destructor calls is covered by my "_or_". – Lightness Races in Orbit Jan 20 '12 at 13:05
  • I don't see how. If you use `new[N]` with `delete`, then N constructors and 1 destructor will be called. You say that if the behavior matches *one* of these two cases, it is "safe: (a)`new`/`delete` or (b)`new[N]`/`delete[]`. Case (a) will call 1 ctor and 1 dtor, so it is different. Case (b) will call N ctors and N dtors, so it is different. If the behavior does not match *either* of the behaviors you consider safe, then it must be unsafe – jalf Jan 20 '12 at 13:09
  • @jalf: Please use `@replies`! changing the number of destructor calls leads is involved in (b), which is valid as we all know. – Lightness Races in Orbit Feb 08 '12 at 13:09
  • @LightnessRacesinOrbit: that made precisely no sense to me. – jalf Feb 08 '12 at 13:55
  • So you're unable to explain what you meant? "the number of destructor calls leads is involved" isn't a meaningful sentence in English. – jalf Feb 08 '12 at 13:57
0

I expect that new[] and delete[] just boil down to malloc() and free() under Linux (gcc, glibc, libstdc++), except that the con(de)structors get called. The same goes for new and delete, except that the con(de)structors get called differently. This means that if his constructors and destructors don't matter, then he can probably get away with it. But why try?

Adrian Ratnapala
  • `new`, `delete`, `new[]` and `delete[]` may use extra accounting information. It is only guaranteed that the info generated by `new` must be understood by `delete` and the info generated by `new[]` must be understood by `delete[]`. Hence this might cause havoc in other cases as well. – LiKao Jan 20 '12 at 14:29
  • Absolutely true. I am not talking about what is guaranteed, I am just talking about what is probably true for now. So yep, nobody should try this kind of trick. – Adrian Ratnapala Jan 21 '12 at 13:40
  • No it is not "probably true" that they just boil down to `malloc()` and `free()`. As you can see from the example I gave in the answer above there is much more going on besides the bare call to `malloc()` and `free()` and this extra logic usually leads to a heap corruption under Linux which is fatal for the program. The usual fastest trick to take care of the accounting info is to store it directly with the allocated block which is fatal when the wrong operator is called on it later. – LiKao Jan 22 '12 at 20:34