
In C we learn that if we allocate memory dynamically, we need to free it at the end of the program, otherwise there is a memory leak.

#include <stdio.h>
#include <stdlib.h> //for malloc(), free() and system()
int a = 17;
int main(void)
{
  int b = 18; //automatic stack memory
  int * c;
  c = malloc( sizeof( int ) ); //dynamic heap memory
  *c = 19;
  printf("a = %d at address %x\n", a, &a);
  printf("b = %d at address %x\n", b, &b);
  printf("c = %d at address %x\n", *c, c);
  free(c);  
  system("PAUSE");  
  return 0;
}

My question is: why do we need to do it manually? Won't the memory get freed by itself when the program ends (like in the above example)?

syed_noorullah
    It will get freed by most operating systems after the program terminates. There is a debate regarding whether you should rely on that, but in my opinion it's always best to free any memory you allocate. And of course any serious program makes sure there is no leakage at any time. – Banex Aug 28 '17 at 18:55
  • From Wikipedia: "Failure to deallocate memory using free leads to buildup of non-reusable memory, which is no longer used by the program. This wastes memory resources and can lead to allocation failures when these resources are exhausted." Basically, a program that runs for a long time can end up eating a lot of memory, and if you never deallocate until the program closes, it can slow down the whole system (see the first sketch after these comments). – odin Aug 28 '17 at 18:57
  • If you have secret information such as passwords or keys, you should also clear the memory before freeing it (see the second sketch after these comments). – stark Aug 28 '17 at 18:59
  • @odin As explained in the answer I just linked to, that is flat-out wrong on modern operating systems. If Wikipedia says that, Wikipedia needs to be corrected. – zwol Aug 28 '17 at 19:11
  • @zwol what odin was trying to say, is that in a long running program you should `free` memory when not required. How is that affected by a modern OS? Does it have time-out on memory usage? Is allocated but unused memory harmlessly "parked" in a virtual system? – Weather Vane Aug 28 '17 at 19:58
  • Imagine if you were using Facebook, and you had to close your browser every five minutes because it never `free`d its objects when it was done with them! Imagine if leaving your browser open would cause other programs (and your browser itself) to complain that you're "out of memory"... How happy would you be about this? Because this is the world you're suggesting, and I want nothing to do with it! – autistic Aug 28 '17 at 20:21
  • @zwol As explained in the answer you just linked to, "You may decide you want to integrate your small program into a larger, long running one."... and since the longer-running one (potentially infinitely running) takes longer to exit (possibly never exits), that *leaked memory* "leads to buildup of non-reusable memory, which is no longer used by the program", according to Wikipedia... I see no problem, do you? – autistic Aug 28 '17 at 20:35
  • @seb I don't remember the second sentence of odin's comment being there when I wrote my comment. Without that, it sounds like a claim that operating systems _don't_ deallocate memory on exit. In the context of long-lived programs, certainly memory should be deallocated when you are through with it. – zwol Aug 28 '17 at 21:31
  • @zwol Nice theory, but invalid. The C standard doesn't require that `main` be the entry point for a program; in fact, a conforming freestanding program may exist as a dynamic library loaded (and subsequently unloaded) by a webserver... Do you think `malloc`'d memory is going to be automatically `free`'d by the webserver, when the library unloads itself? It seems like a lot more work for the webserver, than for the library, doesn't it? – autistic Aug 28 '17 at 22:14
  • @zwol ... and what relevance does the OS (which isn't required) have, here? Sure, the OS might clean up after the webserver, when the webserver goes down... If this is what you expect to happen, you should leave this website! How the hell could the OS be expected to reclaim all resources from programs which it can't even be reasonably expected to see [edit: ... in a timely fashion]? – autistic Aug 28 '17 at 22:16
  • @zwol Regardless, I think you should start citing reputable experts from now on (everywhere, not just in this post)... For example, "When we are done using a chunk of allocated memory, we call free to inform the system that the chunk is available for later use." and "... memory leaks are subtle defects that plague many large systems." -- Sedgewick. – autistic Aug 28 '17 at 22:29
  • In `printf("c = %d at address %x\n", *c, c);` change `%x` to `%p` (a corrected version of the whole program appears after these comments). – EsmaeelE Aug 28 '17 at 22:52
  • Never use `system("PAUSE");`: it causes a platform dependency. You can simply use `getchar()` instead; see [`system("pause") hell`](https://stackoverflow.com/questions/24776262/pause-console-in-c-program). – EsmaeelE Aug 28 '17 at 23:30
  • @EsmaeelE I'd tend to discourage `getchar()`, too, as it promotes a lack of understanding of the bigger picture. Sure, there are more reasons to discourage `system("pause")`, but one of those is the same for both: writing shell scripts to interoperate with console programs becomes tedious when those programs do things like this... – autistic Aug 29 '17 at 00:22
  • When you do want to keep the window open for your own development purposes, perhaps `assert(0);` might be most beneficial. Never be afraid to learn new things :) – autistic Aug 29 '17 at 00:25
  • @Seb thanks, I tried to replace `getchar()` with `assert(0)`, but after execution the terminal shows `Assertion `0' failed. Aborted (core dumped)` and it does not behave like `getchar()`. – EsmaeelE Aug 29 '17 at 00:41
  • @EsmaeelE That's intentional. `assert(0)` should cause your debugger to break, and so your console window will remain open (that's what you wanted, right?)... but only in debug mode, since `assert` will be optimised away in release builds... that's also what you want, right? If not, then it's clear to me that you shouldn't be programming for the console which you obviously never use! Right? – autistic Aug 29 '17 at 16:02
  • @EsmaeelE Please look up "assert" in your dictionary, and come to a logical deduction with regards to what it means in computer software. It's an invaluable tool which you're missing out on! – autistic Aug 29 '17 at 16:03
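
To make the long-running-program point from the comments concrete, here is a minimal sketch of a service-style loop; the loop, `BUF_SIZE`, and the work placeholder are illustrative assumptions, not code from the question. With the `free` in place the process runs in constant memory; remove it and the process grows by `BUF_SIZE` bytes per iteration until allocations start to fail.

#include <stdlib.h>

#define BUF_SIZE 4096 //assumed per-iteration buffer size, for illustration

int main(void)
{
  for (;;) { //long-running service loop
    char *buf = malloc(BUF_SIZE);
    if (buf == NULL)
      return EXIT_FAILURE; //allocation failed
    // ... use buf for one unit of work ...
    free(buf); //omit this and the process leaks BUF_SIZE bytes per iteration
  }
}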
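
On stark's point about secrets, here is a minimal sketch of clearing a buffer before freeing it; `discard_key` and `KEY_LEN` are hypothetical names used only for illustration. Note that an optimising compiler may drop a plain `memset` before `free` as a dead store, so where available `explicit_bzero` (BSD/glibc) or `memset_s` (C11 Annex K) is the safer call.

#include <stdlib.h>
#include <string.h>

#define KEY_LEN 32 //assumed key size, for illustration

//Hypothetical helper: zero the secret before releasing its memory, so the
//key does not linger in the freed heap block.
void discard_key(unsigned char *key)
{
  memset(key, 0, KEY_LEN); //clear the secret first (see caveat above)
  free(key);               //then release the memory
}

int main(void)
{
  unsigned char *key = malloc(KEY_LEN);
  if (key == NULL)
    return EXIT_FAILURE;
  // ... fill key with secret material and use it ...
  discard_key(key);
  return 0;
}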
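
Finally, a corrected sketch of the question's program along the lines EsmaeelE suggests: `%p` with a `void *` conversion for printing addresses, `getchar()` in place of `system("PAUSE")`, and a check on the result of `malloc`.

#include <stdio.h>
#include <stdlib.h>

int a = 17; //static storage

int main(void)
{
  int b = 18; //automatic stack memory
  int *c = malloc(sizeof *c); //dynamic heap memory
  if (c == NULL)
    return EXIT_FAILURE; //always check the result of malloc
  *c = 19;
  printf("a = %d at address %p\n", a, (void *)&a);
  printf("b = %d at address %p\n", b, (void *)&b);
  printf("c = %d at address %p\n", *c, (void *)c);
  free(c);
  getchar(); //portable replacement for system("PAUSE")
  return 0;
}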

0 Answers