
My question is about the absolute scope of memory allocated on the heap. Say you have a simple program like

class Simple
{
private:
  int *nums;
public:
  Simple()
  {
    nums = new int[100];
  }
  ~Simple()
  {
    delete [] nums;
  }
};

int main()
{
  Simple foo;
  Simple *bar = new Simple;
}

Obviously foo falls out of scope at the end of main and its destructor is called, whereas the object bar points to will never have its destructor called unless delete is called on it. So both the Simple object that bar points to and its nums array will be lost on the heap. While this is obviously bad practice, does it actually matter, given that the program ends immediately afterwards? Am I correct in my understanding that the OS will free all heap memory it allocated to this program once it ends? Are the effects of my bad decisions limited to the time the program runs?

  • In C++, class definitions end in a semicolon. As of now your program won't even compile. – scohe001 Jul 16 '14 at 19:17
  • 2
    [here](http://stackoverflow.com/questions/273209/are-memory-leaks-ever-ok) you should find the answer – lelloman Jul 16 '14 at 19:17
  • Dur. You are correct. Fixed. – Brendan Wilson Jul 16 '14 at 19:18
  • It's not good practice, but the OS will always clean up after you and deallocate any allocated memory you had (in reality, the OS only cares about virtual memory mappings—it doesn't know or care how `malloc()` is implemented, it just deals with calls to `mmap(2)`/`munmap(2)` or equivalent to map/unmap virtual memory addresses). – Adam Rosenfield Jul 16 '14 at 19:19
  • Ok. That's what I thought. I just needed confirmation of that. Thank you. – Brendan Wilson Jul 16 '14 at 19:27
  • While your OS should handle the leak, neglecting to call destructors can have **any** side-effect. Your example would be fine, but code in a destructor can certainly have effects that last after program execution. – Drew Dormann Jul 16 '14 at 19:31
  • The C++ standard is vague about that: It's UB "if the program depends on the side effects of the destructors". So if the destructor flushes a database connection, losing which will get you fired, don't blame C++. – Kerrek SB Jul 16 '14 at 19:34
  • Definitely depends on the destructors, I like ensuring that memory is zeroed out in my destructors. Especially classes that contain security information. If everyone zeroed out their memory, maybe HeartBleed wouldn't have been so bad... – RoraΖ Jul 16 '14 at 19:40
  • @raz: Be sure to use a function which won't be optimized away then... – Deduplicator Jul 16 '14 at 19:53

1 Answer


Any modern OS will reclaim all the memory allocated by a process once it terminates.
Each process has its own virtual address space on all common operating systems nowadays, so it's easy for the OS to claim that memory back.
Needless to say, it's bad practice to rely on the OS for that.
It essentially means such code can't be reused in a program that runs for a long while. Also, in real-world applications destructors may do far more than just deallocate memory.
A network client may send a termination message, a database-related object may commit transactions, and a file-wrapping object may write some closing data to its file.
In other words: don't let your memory leak.

Moshe Gottlieb
  • Is it a bad practice? Modern apple, firefox and others actively do not clean up memory on exit, for a speed gain on shutdown which can be ten minutes in some cases. – Deduplicator Jul 16 '14 at 19:51
  • @Deduplicator In art, there's no such thing as "there's no such thing". I suppose every thing can be put to good use, not that I'm aware of anyone doing it. I will rephrase - In general, it is a bad practice. – Moshe Gottlieb Jul 17 '14 at 15:40