
I know that I am supposed to use std::vector or other alternatives, but there's something I don't understand with pointers.

When we create a pointer array:

int* a = new int[100];

It should allocate a block of 100 * sizeof(int) bytes, shouldn't it?

When we don't need it anymore, we do:

delete [] a;

1.

This is wrong:

delete a;

But when I actually do this (I once did it accidentally in a recreational program), no runtime error seems to be triggered (unlike in part 3 below) and the program runs normally (despite a possible memory leak). Why? And does it actually delete (free) the first element of the array?


2.

According to a question on StackOverflow, delete[] knows the size of the array that it needs to delete.

But what happens when I delete the wrong thing? (It causes a runtime error: codepad.org shows memory clobbered before allocated block, and VS2010 shows a Debug Assertion Failed dialog.) Why doesn't it just delete (free) elements 1 to 99?

delete [] &a[1]; // or
delete [] (a + 1);

3.

The following code also shows memory clobbered before allocated block. But why doesn't it just delete (free) element 99? And why does it cause an error, when plain delete a in part 1 doesn't?

delete &a[99]; //or
delete (a + 99);

Do the C++ standards actually state what will happen in the cases above?

Alvin Wong
  • The memory layout for these situations is somewhat implementation-defined. We **should not** mess with these things. – Mark Garcia Jan 01 '13 at 06:42
  • Using the wrong form of `delete` is UB. Simple as that. – chris Jan 01 '13 at 06:42
  • As to 1., this is undefined behavior, so it worked with this code and this compiler; try it on 3 other ones and it'll probably start crashing. – daniel gratzer Jan 01 '13 at 06:43
  • So, undefined behavior is the universal explanation of something that doesn't follow the rules? – Alvin Wong Jan 01 '13 at 06:57
  • @AlvinWong : Undefined behavior is an exceptional situation that should be avoided as far as the developer is concerned; how to handle it is left to the compiler makers/vendors. From the developer's point of view, how compilers handle it has no significance; the only thing that matters is that it should be avoided. – NeonGlow Jan 01 '13 at 07:12
  • What "undefined behavior" means is exactly what it says: the C++ language does not define what happens when you do this. The compiler could give you an compiler error, or try to guess what you really wanted and generate the code for that, or generate code that erases your hard drive. (Actually, in some cases, it's not allowed to generate an error, but that's a minor quibble.) – abarnert Jan 01 '13 at 07:42
  • Also, for 3, what did you _want_ it to do? What should happen if you freed the memory for `a[99]`, or `a[50]`, and then later correctly called `delete [] a`? Most likely what you're _really_ looking for is to destroy the object at `a[99]` (although that's kind of silly for `int`, presumably this is just a toy example). But you can't even really do that, because when you later `delete [] a`, it's going to destroy `a[99]` again. What you _can_ do is `swap` it with an empty object, `move` it to another object, or just assign an empty object in its place, and for many types, that's good enough. – abarnert Jan 01 '13 at 07:46

2 Answers

  1. According to the C++ standard, deleting objects allocated with new[] using delete is undefined behavior. Anything can happen - from "nothing" to corrupting your heap beyond all recognition. It makes no sense to say what exactly gets deleted in cases like that, because it is heavily system-dependent.

  2. Deleting "a wrong thing" is undefined behavior. Typically, this corrupts your heap, because common implementations of allocators expect certain values to be stored just prior to the address that you pass for deallocation. This is by no means a standard, so again, anything can happen.

  3. It does not delete element 99 because element 99 was never allocated on its own: what was allocated is a single array of 100 items, not 100 individual ints. You are expected to deallocate all 100 items at once; there is no way to deallocate the array piecemeal.

Sergey Kalinichenko
  • Standard Reference: C++11 3.7.4.2-p3, "...the behavior is undefined if the value supplied to operator delete(void*) in the standard library is not one of the values returned by a previous invocation of either operator new(std::size_t) or operator new(std::size_t, const std::nothrow_t&) in the standard library, and the behavior is undefined if the value supplied to operator delete[](void*) in the standard library is not one of the values returned by a previous invocation of either operator new[](std::size_t) or operator new[](std::size_t, const std::nothrow_t&) in the standard library." – WhozCraig Jan 01 '13 at 07:26

Why? And does it actually delete (free) the first element of the array?

No. Technically it causes Undefined Behavior, which means anything can happen.

The rules for using new and new [] are pretty simple:

  • If you use new you must use delete; if you use new [] you must use delete [] for deallocation.
  • You must pass the exact address returned by new or new [] to delete or delete [], respectively.

If you don't follow the rules, you end up with Undefined Behavior.

Why all this?

Because the C++ standard says so.
The standard gives you the means to manage the lifetime and memory of your objects through dynamic allocation; in return, you enter a contract to follow the rules the standard mandates for that functionality.
Break the contract, and you suffer the penalty.

In short,

"With greater power comes greater responsibility"

Alok Save