6

Does newly allocated memory always get freed when the program closes (even if it closes unexpectedly because of a bug/error, or through custom close functions)?

Or is the memory only freed if the program returns from `main`?

Jonathan Leffler
Tez
  • Windows XP, 7, and 8 should do the work, but you should check it yourself – Guy P Apr 14 '13 at 10:22
  • 3
    http://stackoverflow.com/questions/2975831/is-leaked-memory-freed-up-when-the-program-exits and http://stackoverflow.com/questions/12106349/memory-leaks-when-program-is-closed-with-x?rq=1 and http://stackoverflow.com/questions/15467298/how-far-can-memory-leaks-go?rq=1 and http://stackoverflow.com/questions/7472914/does-a-c-program-automatically-free-memory-when-it-crashes?rq=1 and http://stackoverflow.com/questions/7532529/return-of-memory-at-the-termination-of-a-c-program?rq=1 I'm sure there are even more :) – s3rius Apr 14 '13 at 10:23
  • So it's okay to use exit, or abort/terminate, in my error handling functions? – Tez Apr 14 '13 at 10:43

2 Answers

5

Yes, operating systems usually keep track of the memory allocated by each process and release it when those processes terminate - no matter how they terminate.
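As a minimal sketch (illustrative only, not a practice to imitate), the following program allocates and never frees; on any mainstream desktop OS the block is reclaimed when the process terminates:

```cpp
#include <cstdlib>

int main() {
    // Never freed on purpose: when the process terminates, the OS tears
    // down its entire address space, reclaiming this block along with
    // everything else.
    void* block = std::malloc(1024 * 1024);  // 1 MiB
    if (block == nullptr) return 1;
    return 0;  // no std::free(block): the OS cleans up at process exit
}
```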

This, however, is generally not a valid reason to have memory leaks in your program: a program should always actively release the resources (including memory) it acquires, unless there is a really good - and documented - reason for not doing so.

Good reasons include cases where the program's correctness depends on the order of destruction of global/singleton objects, or where actively freeing all allocated memory right before termination would be prohibitively expensive.
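For instance (a hedged sketch; `Node` and `shutdown` are hypothetical names, echoing James Kanze's profiler anecdote in the comments below), deliberately skipping the teardown of a huge structure just before exit might look like this, with the decision documented at the site where it is made:

```cpp
#include <memory>
#include <vector>

struct Node {
    std::vector<std::unique_ptr<Node>> children;  // owns the whole subtree
};

// Hypothetical shutdown helper. Destroying millions of nodes one by one
// right before exit can dominate shutdown time, so the leak here is an
// intentional, documented decision: the OS reclaims the heap anyway.
void shutdown(std::unique_ptr<Node> tree, bool fast_exit) {
    if (fast_exit) {
        (void)tree.release();  // INTENTIONAL LEAK: give up ownership, free nothing
    }
    // Otherwise tree's destructor runs here and frees every node recursively.
}

int main() {
    auto root = std::make_unique<Node>();
    // ... build and use a large tree ...
    shutdown(std::move(root), /*fast_exit=*/true);
}
```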

However, while admitting that there can be reasons why a programmer would intentionally avoid releasing memory, please be careful not to develop too lax a mindset about what counts as a "good reason" for not cleaning up after yourself.

I would encourage you to get used to writing code that releases the memory it acquires, and to document explicitly and clearly every situation where you do not follow this practice. Again, while there may be corner cases that justify it, releasing or not releasing acquired memory should always be an active, intentional decision of the programmer.
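In C++, the idiomatic way to make releasing the default (so that not releasing becomes the deliberate, documented exception) is RAII: tie each resource to an object whose destructor releases it. A minimal sketch:

```cpp
#include <fstream>
#include <memory>
#include <vector>

int main() {
    // Each resource is owned by an object whose destructor releases it.
    auto buffer = std::make_unique<std::vector<int>>(1'000'000);  // heap memory
    std::ofstream log("log.txt");                                 // file handle

    // ... use buffer and log ...

    // No explicit cleanup: when main returns, destructors run in reverse
    // order of construction, closing the file and freeing the memory.
}
```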


NOTE: Quoting Steve Jessop from the comments, another good reason not to actively release memory is when your program needs to be terminated because it has somehow reached an unexpected state - perhaps one that violates an invariant, or a precondition of a certain function. Usually, violating a precondition means undefined behavior (UB).

Since - by definition - there is no sane way to recover from UB, you may want to terminate your program immediately rather than perform further actions that could have any outcome whatsoever - including highly undesirable ones.
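To make the difference concrete, here is a small sketch of which cleanup machinery each termination path triggers (behavior per the C++ standard; uncomment exactly one path):

```cpp
#include <cstdio>
#include <cstdlib>

struct Tracer {
    const char* name;
    ~Tracer() { std::printf("destructor of %s ran\n", name); }
};

Tracer global_tracer{"a static-duration object"};

int main() {
    std::atexit([] { std::puts("atexit handler ran"); });
    Tracer local_tracer{"a local object"};

    // Pick exactly one termination path:
    // return 0;      // local destructors, then atexit handlers and static destructors
    // std::exit(0);  // atexit handlers and static destructors, but NOT local destructors
    std::abort();     // abnormal termination: none of the above run
}
```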

Andy Prowl
  • How come, with some dodgy software, when you close it your computer still keeps running slowly and you have to restart? – Tez Apr 14 '13 at 10:23
  • @Tez: There could be several reasons, but AFAIK this is very unlikely to be because of unreleased memory. Modern OSs (at least those that run on desktop computers, not sure about embedded OSs) do release the memory allocated by processes when those processes terminate. – Andy Prowl Apr 14 '13 at 10:24
  • 1
    @Tez Sometimes closing the application window doesn't terminate the entire program. Background processes that should normally be shut down might keep running and eat your system resources. – s3rius Apr 14 '13 at 10:26
  • 2
    The advice is incorrect. There is no black or white answer w.r.t memory release at the end of the program, the correct answer is: *It depends*. – Alok Save Apr 14 '13 at 10:27
  • @AlokSave: What advice are you referring to? Not to have memory leaks in a program? (if not, notice the use of the word "usually" in my answer) – Andy Prowl Apr 14 '13 at 10:28
  • *"a program should always actively release the resources (including memory) it acquires.*" <===== Not correct – Alok Save Apr 14 '13 at 10:28
  • 2
    @AlokSave: I would be glad if you could elaborate on that. Why should a program not close files it opened, close connections it opened, release memory it acquired, unlock global mutexes it locked, and so on? – Andy Prowl Apr 14 '13 at 10:29
  • Re the second paragraph: why? Normally, a singleton should _not_ be destructed, for example, as destructing it may cause order of destruction problems. – James Kanze Apr 14 '13 at 10:37
  • @AndyProwl: The answer is *it depends*, because *resources* can mean *memory* or other constructs like mutexes, file handles, etc. Note that I said *memory*, not *resources*. What good does it do you to deallocate a block of memory which will be reclaimed anyway? It could bite you badly if that class destructor has side effects, but if not, there is nothing that actually makes you say *must*; in fact, to the contrary, it is advisable not to deallocate it. For example, as James points out above: singletons. – Alok Save Apr 14 '13 at 10:37
  • @Tez The most likely reason, assuming performance returns after a restart, is that the program has started child processes, which continue running. Another possible reason (but a restart typically won't help here) is that the program has managed to fragment the hard disk. And finally, of course, if the program was running with administrator/super-user rights, who knows what it could have done with the OS. (You shouldn't normally be running everyday programs with administrator rights, but I suspect that a lot of home Windows systems do.) – James Kanze Apr 14 '13 at 10:40
  • @JamesKanze: OK, singletons are probably the only quasi-fair exception, but notice that singleton patterns are renowned to be troublesome and are discouraged (by the GoF as well), with global variables being preferred. But I would rather not enter a philosophical debate here – Andy Prowl Apr 14 '13 at 10:40
  • @AlokSave: As I wrote in my reply to James, singletons are a quasi-fair exception, but have issues of their own. Global variables should be preferred - and those *are* destructed at program termination. It is a matter of being a literate programmer: you should always clean after yourself. Yes, there are very corner-case exceptions to this rule, but answering "*It depends*" to such a question is not a good didactic advice IMHO. Although yes, I will add a footnote later (have to leave for a while now) – Andy Prowl Apr 14 '13 at 10:42
  • @AndyProwl: I do not wish to get into a debate that is not based on technical facts. The technical & correct answer to this Q is *it depends*. Some answers are always grey and we should not try to force them to be black or white just for the sake of it. Your answer as it is now is incorrect. It deserves a downvote and it has one from me now. If you correct it, leave me a note and I will retract the downvote. – Alok Save Apr 14 '13 at 10:47
  • And of course, you did say "memory leaks". Not deleting a singleton isn't really a memory leak, even though some tools report it as such. (A more interesting case: in a program I wrote many years ago, which took tens of seconds to execute on some data sets, the profiler reported over a third of the run-time was taken freeing up a large tree structure that I'd built, just before terminating. So I simply terminated without freeing it. Probably not good programming practice, but "the profiler made me do it".) – James Kanze Apr 14 '13 at 10:49
  • @AndyProwl The main reason for using a singleton rather than an ordinary global variable (in C++) is to avoid order of construction and order of destruction issues. Global variables (rather than singletons) are often to be avoided because of this. (And yes, I agree that when this is the motivation, calling it a singleton is probably misleading. But because the most frequently used idiom derives from the GoF singleton pattern...) – James Kanze Apr 14 '13 at 10:52
  • I'm with AlokSave here. First you say that the OS does it for you, then you say that you should do it yourself anyway. See [this link](http://en.wikipedia.org/wiki/Anal_retentiveness). – TonyK Apr 14 '13 at 11:04
  • Just one final comment to think about. Even if you do call `delete` on everything, the memory probably won't be released to the OS until you exit the program anyway. – James Kanze Apr 14 '13 at 11:05
  • @AlokSave: I expanded my answer, thank you for contributing – Andy Prowl Apr 14 '13 at 11:28
  • @JamesKanze: I see your point. Thank you for your explanation – Andy Prowl Apr 14 '13 at 11:29
  • @AndyProwl: Downvote retracted & +1 for putting the answer in simplistic yet correct words. – Alok Save Apr 14 '13 at 11:33
  • For another concrete example, the questioner mentions bug/error situations. If an assertion fails then often that's good enough reason to exit without cleanup. The reason is that if the program reached a state that you (the programmer) don't understand and can't handle, then you have no good reason to expect the cleanup code to act correctly, and it might cause further problems (confusing output due to further errors or an infinite loop due to some data structure being broken, are both fairly plausible). AFAIK that's why `assert` calls `abort()` and not `exit()`. – Steve Jessop Apr 14 '13 at 12:38
  • @SteveJessop: Very good insight, thank you for that – Andy Prowl Apr 14 '13 at 12:49
  • @Steve: I added a note and quoted you. Thank you again – Andy Prowl Apr 14 '13 at 12:54
1

Not all operating systems do this (on modern OSes it is not a problem), so you had better not rely on this behavior. You can have a look here: What REALLY happens when you don't free after malloc?

fatihk