31

When we say a program leaks memory (say, a `new` without a matching `delete` in C++), does it really leak? I mean, when the program ends, is that memory still allocated to some non-running program and unusable, or does the OS know what memory each program requested and release it when the program ends? If I run that program many times, will I run out of memory?
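A minimal sketch of the situation in question: an allocation with `new` that is never passed to `delete[]` (illustrative only).

```cpp
int main() {
    int* data = new int[100]; // heap allocation
    data[0] = 42;             // use it
    return 0;                 // no delete[] -- leaked while running,
                              // but does the OS reclaim it at exit?
}
```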

Baruch
  • possible duplicate of [Memory leak in C,C++; forgot to do free,delete](http://stackoverflow.com/questions/1232262/memory-leak-in-c-c-forgot-to-do-free-delete) – nmichaels Nov 10 '10 at 22:23
  • I'm learning C++, and am afraid that every time I run my program, I'm using more memory (I'm still working on my memory management skills) – Baruch Nov 10 '10 at 22:24
  • Also of relevance: http://stackoverflow.com/questions/104/anatomy-of-a-memory-leak – Christian Mann Nov 10 '10 at 22:25

9 Answers

39

No, in all practical operating systems, when a program exits, all its resources are reclaimed by the OS. Memory leaks become a more serious issue in programs that may continue running for an extended time, and in functions that may be called many times from the same program.

aschepler
  • `all its resources` - it is not true for some persistent objects allocated by the OS for a process, like SysV IPC descriptors (there are some shm objects too), and some Windows handles. – osgx Dec 06 '10 at 18:05
  • So, is it better practice to always delete/free when there is a new/malloc? – jokoon Mar 22 '11 at 07:52
20

On operating systems with protected memory (Mac OS X and later, all Unix clones such as Linux, and NT-based Windows systems, meaning Windows 2000 and later), the memory gets released when the program ends.

If you run any program often enough without closing instances in between (running more and more instances at the same time), you will eventually run out of memory, regardless of whether there is a memory leak or not. Obviously, a program that leaks memory will fill the memory faster than an identical program without the leak, but how many instances you can run before filling the memory depends far more on how much memory the program needs for normal operation than on whether it leaks. That comparison is really only meaningful between two otherwise identical programs, one with a memory leak and one without.

Memory leaks become most serious when you have a program running for a very long time. A classic example of this is server software, such as web servers. With games or spreadsheet programs or word processors, for instance, memory leaks aren't nearly as serious because you close those programs eventually, freeing up the memory. But of course memory leaks are nasty little beasts which should always be tackled as a matter of principle.

But as stated earlier, all modern operating systems release the memory when the program closes, so even with a memory leak, you won't fill up the memory if you're continuously opening and closing the program.

Teekin
  • Curious: Why do all programs leak memory? Are you saying that this is a fundamental characteristic of the von Neumann architecture, or that all programs inevitably have coding errors? – Tom W Dec 07 '10 at 09:01
  • I think the point is that all programs *use* memory, some more efficiently than others. Some programs that technically don't leak still use memory very inefficiently (holding it allocated for far longer than they need to, even if it's eventually released). – Tim Martin Feb 18 '11 at 17:18
  • Not every program will run out of memory. Think of a bare-bones OS kernel that runs nonstop for years - it's also a program. – orip Aug 11 '11 at 21:32
  • I'm not sure which part of my response gave either the impression that all programs leak memory or that any program will eventually run out of memory. I said nothing to the effect of either, as far as I can tell. What I said is that any program *run often enough* will run out of memory, because each instance takes some memory, and since there's a limited amount of memory on any system, if you run enough instances of it, it will run out of memory eventually. I'm not frustrated or anything, but genuinely surprised. :) – Teekin Aug 13 '11 at 19:13
  • I know this is late, but just to clarify, my question was if I run the program multiple times *closing each instance* before running the next, not simultaneously. – Baruch Oct 02 '12 at 20:25
  • @Teekin I think the core issue is the use of the phrase `often enough`. I believe what you mean is that if you run enough instances of a program at the same time, you will eventually exhaust all available memory. However, the phrase `any program run often enough will run out of memory` implies that if I run cURL 1 million times in sequence it will eventually run out of memory, which of course isn't the case. –  Aug 28 '15 at 12:12
  • Teekin, if you're talking about running the same program 1000 times at the same time, then yes, obviously the OS will run out of memory. But running a clean program (without memory leaks) many times, each one after the previous has closed, will never run the system out of memory. There is no actual reason for that to happen. – Matt Jul 27 '16 at 15:09
15

Leaked memory is returned to the OS after execution has stopped.

That's why it isn't always a big problem with desktop applications, but it's a big problem with servers and services (they tend to run for a long time).

Let's look at the following scenario:

  1. Program A asks the OS for memory.
  2. The OS marks block X as being used by A and returns it to the program.
  3. The program now holds a pointer to X.
  4. The program returns the memory.
  5. The OS marks the block as free. Using the block now results in an access violation.
  6. Program A ends, and all memory used by A is marked unused.

Nothing wrong with that.

But if the memory is allocated in a loop and the delete is forgotten, you run into real problems:

  1. Program A asks the OS for memory.
  2. The OS marks block X as being used by A and returns it to the program.
  3. The program now holds a pointer to X.
  4. Goto 1, without ever calling delete: the pointer to the old block is lost, and the block can never be freed.

If the OS runs out of memory, the program will probably crash.
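A minimal C++ sketch of that second, leaking scenario (the buffer size and names are illustrative):

```cpp
int main() {
    for (;;) {
        // Steps 1-3: ask for a block and hold the only pointer to it.
        char* block = new char[1024 * 1024];
        block[0] = 1; // pretend to use the block

        // Step 4: loop again without delete[]. The pointer is
        // overwritten on the next iteration, so the block can
        // never be freed.
    }
    // Never reached: when allocation finally fails, `new` throws
    // std::bad_alloc, and the unhandled exception terminates the program.
}
```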

Toon Krijthe
6

No. Once the OS finishes closing the program, the memory comes back (given a reasonably modern OS). The problem is with long-running processes.

nmichaels
5

When the process ends, the memory gets cleared as well. The problem is that if a program leaks memory, it will request more and more memory from the OS while it runs, and can possibly crash the OS.

Femaref
4

It's a leak more in the sense that the code itself no longer has any grip on that piece of memory.
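In code, losing that grip typically means overwriting or discarding the last pointer to an allocation (a minimal sketch):

```cpp
int main() {
    int* p = new int(1);
    p = new int(2); // the first int is still allocated, but no pointer
                    // to it remains, so the program can never delete it
    delete p;       // frees only the second int
    return 0;       // the OS reclaims the leaked first int at exit
}
```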

Joachim VR
3

The OS can release the memory when the program ends. If a leak exists in a program, it is only an issue while the program is running. This is a problem for long-running programs such as server processes. For example, if your web browser had a memory leak and you kept it running for days, it would gradually consume more and more memory.

mikej
3

As far as I know, on most OSes, when a program is started it receives a defined segment of memory which is completely released once the program ends.

Memory leaks are one of the main reasons garbage collection algorithms were invented: once plugged into the runtime, they become responsible for reclaiming memory that is no longer accessible to the program.
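C++ itself has no built-in garbage collector, but the same goal, reclaiming memory automatically, is usually met deterministically with RAII, for example `std::unique_ptr` (a minimal sketch, requires C++14):

```cpp
#include <memory>

void no_leak() {
    // unique_ptr owns the allocation; its destructor calls delete[]
    // automatically when it goes out of scope, even on early return
    // or exception, so the memory cannot be orphaned.
    std::unique_ptr<int[]> data = std::make_unique<int[]>(100);
    data[0] = 42;
} // memory released here

int main() {
    no_leak();
    return 0;
}
```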

jdecuyper
2

Memory leaks don't persist past the end of execution, so a "solution" to any memory leak is simply to end program execution. Obviously this is more of an issue for certain types of software. A database server that needs to go offline every 8 hours due to memory leaks is a bigger problem than a video game that needs to be restarted after 8 hours of continual play.

The term "leak" refers to the fact that over time memory consumption will grow without any increased benefit. The "leaked" memory is memory neither used by the program nor usable by the OS (and other programs).

Sadly, memory leaks are very common in unmanaged code. I have had Firefox running for a couple of days now and memory usage is 424 MB despite only having 4 tabs open. If I closed Firefox and re-opened the same tabs, memory usage would likely be under 100 MB. Thus 300+ MB has "leaked".

Gerald Davis
  • Your Firefox example is not necessarily a leak. Programs may keep previously allocated memory in a "pool" and reuse it instead of releasing and reallocating it every time it is needed. It is only a "leak" if the program no longer holds a reference to it, thus making it unreachable. – Baruch Oct 09 '12 at 07:47