
My textbook states that when you allocate memory using new[], the compiler allocates an extra 4 bytes to keep track of the array size. I wonder where those extra 4 bytes are stored? And how do I explain the following core dump?

#include <iostream>
using namespace std;

class A
{
    int m;
};

int main()
{
    A* a = new A[10];
    A* b = a + 3;
    delete[] b;
    delete[] a;
}
Han
    The compiler *might* store the allocation info that way, or it might not. The standard does not say how to track array size. A perfectly valid implementation of `new[]()` is to forward its calls to plain old `new()` (and the same for delete). – Joel Cornett Jan 17 '16 at 01:53
  • I am just curious how the default version of new[]() and delete[]() could tell the difference between `a` and `b`; is C++ have any guidance to avoid this kind of error? – Han Jan 17 '16 at 01:57
    @JoelCornett: `operator new[]()` can forward its calls to `operator new()`, true, but the compiler only calls `operator delete[]()` *after* calling the destructor the right number of times, which requires storing the size somewhere. The Standard doesn't say how. – Ben Voigt Jan 17 '16 at 02:07
    @Han: It can't and it won't try to. Ensuring that things like `delete[] b;` don't happen is your responsibility, not the C++ execution environment's. The execution of the erroneous `delete` is undefined behaviour, q.v. Also, if the objects being deleted have trivial destructors, there is no need to know how many of them there are, only how much space is occupied, and the underlying memory management library probably already knows that. – rici Jan 17 '16 at 02:09
  • @benvoigt: if the compiler knows the destructor does nothing, it has no obligation to go through the motions of calling it. In which case, a count is not required. – rici Jan 17 '16 at 02:11
  • @rici: Right... but the Standard also doesn't require zero overhead in case of trivial destructors. – Ben Voigt Jan 17 '16 at 02:12
    all around, seems a duplicate of http://stackoverflow.com/q/197675/103167 – Ben Voigt Jan 17 '16 at 02:14

2 Answers


I wonder where those extra 4 bytes are stored?

There are two common approaches. One is to store it right before the returned address. The other is to store it in a separate associative container indexed by the returned address.

How do I explain the following core dump?

Likely you were on a platform that stores the number of elements just before the returned address. The second delete[] then read garbage for the size. But exactly how it fails will depend on the platform; any number of horrible things could happen.

David Schwartz
  • Can you elaborate on the second answer? If I am on a platform storing the number of elements prior to the returned address, when I do "delete[] b" the 4 bytes before b are 0, so shouldn't it do nothing, and everything should be fine? Also a side question: why does C++ disallow `A a[] = new A[10]`, so I could assume `a` is a const pointer? – Han Jan 17 '16 at 01:49
    @Han: If you want '`a` is a `const` pointer', then just say so. `A* const a = new A[10];` is perfectly legal. `A[]` does not mean the same as `A* const` (except in an argument list) – Ben Voigt Jan 17 '16 at 02:08
  • There are many more allocation schemes around. The allocator might store a pointer to the next block in front of the actual data, or segment the memory by powers of two. It might pool blocks of similar size, etc. All of them have their pros and cons. Have a look at "The Art of Computer Programming" for more. – cdonat Jan 17 '16 at 07:26
  • @Han Who knows how many bytes are used. Maybe it's 4, maybe it's not. Who knows what zero encodes. Maybe it means a minimum sized block, maybe not. And who knows what `delete[]` does *after* it invokes any necessary destructors. Maybe it tried to link it to the last block released and the overlap of the two blocks caused it to crash. Also, some platforms know that some types don't require any destructor and might not allocate space for the size in that case. – David Schwartz Jan 17 '16 at 20:20
  • Thanks guys. It is helpful. – Han Jan 18 '16 at 00:36

It sounds like you have a bad textbook.

As a practical matter, every new implementation adds overhead, or calls functions that add overhead. That overhead could be 4, 8, 16, or any other number of bytes. To say it is always 4 is incorrect.

Typically, overhead is added in front of (below) the memory returned by new. Many allocators also add guard bytes at the end that are used to check for overruns.

Assuming that the overhead is 4 bytes (and unsigned int is 4 bytes), you could, in principle, peek at it like this (note that reading it is implementation-specific):

 A* a = new A[10];
 unsigned int *overhead = reinterpret_cast<unsigned int *>(a) - 1;

In your case

int main()
{
    A* a = new A[10];
    A* b = a + 3;
    delete[] b; // 1
    delete[] a; // 2
}

delete (1) attempts to delete a block that was never returned by new[]. Some new implementations could catch this.

delete (2) attempts to delete after the heap has already been corrupted by (1).

user3344003