12

Suppose I had a program like this:

int main(void)
{
    const int x = 10;       // some array size
    int* arr = new int[x];
    // processing; neglect to call delete[]
    return 0;
}

In a trivial example such as this, I assume there is little actual harm in neglecting to free the memory allocated for arr, since it should be released by the OS when the program is finished running. For any non-trivial program, however, this is considered to be bad practice and will lead to memory leaks.

My question is, what are the consequences of memory leaks in a non-trivial program? I realize that memory leaks are bad practice, but I do not understand why they are bad and what trouble they cause.

Tyler Gaona
  • 479
  • 1
  • 4
  • 14
  • 4
    Memory leaks permanently and uselessly consume system resources, which are finite. Is that not a sufficient reason? – Nemo Jan 19 '14 at 18:51
  • Memory leaks are at their worst when they're in a loop. Think about a game which iterates a loop for every update step and has a memory leak in that loop. – Joseph Mansfield Jan 19 '14 at 18:51
  • 4
    You will run low on memory, but before that happens your system will slow down because of paging – marcinj Jan 19 '14 at 18:51
  • Possible duplicate of [Are memory leaks ever ok?](http://stackoverflow.com/questions/273209/are-memory-leaks-ever-ok). – FreeNickname Jan 19 '14 at 18:52
  • @Leeor Running out of memory is the obvious downfall. I meant are there other negative aspects that can result from leaking memory? – Tyler Gaona Jan 19 '14 at 18:58
  • If you're using Windows, the OS will free the memory for you with this specific piece of code, but there are some scenarios where this might not be the case (source: very experienced greyhat). – Sebastian Hoffmann Jan 19 '14 at 18:59
  • 2
    @TylerGaona, sorry, had to :) But marcinj is right - you'll eventually start swapping pages to disk, so you may be looking at a huge slowdown. – Leeor Jan 19 '14 at 19:00
  • @KerrekSB I wouldn't actually write code like the **example** I provided. I was asking about memory leaks in general. – Tyler Gaona Jan 19 '14 at 19:01

6 Answers

17

A memory leak can diminish the performance of the computer by reducing the amount of available memory. Eventually, in the worst case, too much of the available memory may become allocated and all or part of the system or device stops working correctly, the application fails, or the system slows down unacceptably due to thrashing.

Memory leaks may not be serious or even detectable by normal means. In modern operating systems, normal memory used by an application is released when the application terminates. This means that a memory leak in a program that only runs for a short time may not be noticed and is rarely serious.

Much more serious leaks include those:

  • where the program runs for an extended time and consumes additional memory over time, such as background tasks on servers, but especially in embedded devices which may be left running for many years
  • where new memory is allocated frequently for one-time tasks, such as when rendering the frames of a computer game or animated video
  • where the program can request memory — such as shared memory — that is not released, even when the program terminates
  • where memory is very limited, such as in an embedded system or portable device
  • where the leak occurs within the operating system or memory manager
  • when a system device driver causes the leak
  • running on an operating system that does not automatically release memory on program termination. Often on such machines if memory is lost, it can only be reclaimed by a reboot, an example of such a system being AmigaOS.

Check out here for more info.
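
To make the long-running and per-frame cases above concrete, here is a rough sketch (the renderFrame loop is a hypothetical illustration, not taken from the answer) of a leak that is harmless for one iteration but fatal over hours of running:

#include <cstddef>

void renderFrame()
{
    std::size_t pixels = 1920 * 1080;
    int* scratch = new int[pixels];   // roughly 8 MB of scratch space per frame
    // ... draw into scratch ...
    // missing delete[] scratch: at 60 frames per second this leaks
    // on the order of hundreds of MB every second
}

int main()
{
    for (;;)             // a game or server loop left running for hours
        renderFrame();
}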

herohuyongtao
  • 49,413
  • 29
  • 133
  • 174
5

There is an underlying assumption to your question:

The role of delete and delete[] is solely to release memory.

... and it is erroneous.

For better or worse, delete and delete[] have a dual role:

  1. Run destructors
  2. Free memory (by calling the right overload of operator delete)
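
A minimal sketch of that dual role (the Logger type below is a made-up illustration): the destructor's side effect only happens when delete is called.

#include <iostream>

struct Logger
{
    ~Logger() { std::cout << "flushing log\n"; }   // a side effect we rely on
};

int main()
{
    Logger* a = new Logger;
    delete a;                // 1. destructor runs, 2. memory is freed

    Logger* b = new Logger;  // never deleted: the memory leaks *and*
    return 0;                // "flushing log" is never printed for b
}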

With the corrected assumption, we can now ask the corrected question:

What is the risk in not calling delete/delete[] to end the lifetime of dynamically allocated variables?

As mentioned, an obvious risk is leaking memory (and ultimately crashing). However, this is the least of your worries. The much bigger risk is undefined behavior, which means that:

  • the compiler may not produce executable code that behaves as expected: Garbage in, Garbage out.
  • in pragmatic terms, the most likely outcome is that destructors are not run...

The latter is extremely worrisome:

  • Mutexes: Forget to release a lock and you get a deadlock...
  • File Descriptors: Some platforms (such as FreeBSD I believe) have a notoriously low default limit on the number of file descriptors a process may open; fail to close your file descriptors and you will not be able to open any new file or socket!
  • Sockets: on top of being a file descriptor, there is a limited range of ports associated to an IP (which with the latest version of Linux is no longer global, yeah!). The absolute maximum is 65,536 (u16...) but the ephemeral port range is usually much smaller (half of it). If you forget to release connections in a timely fashion you can easily end up in a situation where even though you have plenty of bandwidth available, your server stops accepting new connections because there is no ephemeral port available.
  • ...
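
To make the destructor-skipping risk in the list above concrete, here is a minimal sketch (the ReportWriter type is hypothetical, not from the answer): skip delete and it is not just a few bytes that leak, but an open file whose buffered data may never reach the disk.

#include <fstream>
#include <string>

struct ReportWriter
{
    std::ofstream out;
    explicit ReportWriter(const std::string& path) : out(path) {}
    ~ReportWriter() { out.close(); }   // flushes buffered data to disk
};

int main()
{
    auto* w = new ReportWriter("report.txt");
    w->out << "important results\n";
    // missing delete w: the destructor never runs, buffered text may never
    // reach the disk, and the file descriptor stays open for as long as the
    // process lives
    return 0;
}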

The problem with the attitude of well, I got enough memory anyway is that memory is probably the least of your worries simply because memory is probably the least scarce resource you manipulate.

Of course you could say: Okay, I'll concentrate on other resource leaks, but tools nowadays report them as memory leaks (and it's sufficient), so isolating that leak among hundreds/thousands is like seeking a needle in a haystack...

Note: did I mention that you can still run out of memory? Whether on lower-end machines/systems or in restricted processes/virtual machines, memory can be quite tight for the task at hand.

Note: if you find yourself calling delete, you are doing it wrong. Learn to use the Standard Library: std::unique_ptr and containers such as std::vector. In C++, automatic memory management is easy; the real challenge is avoiding dangling pointers...
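
A minimal sketch of that advice, mirroring the question's example with standard facilities (needs C++14 for std::make_unique):

#include <memory>
#include <vector>

int main()
{
    auto p = std::make_unique<int>(42);   // owns one int, freed automatically

    std::vector<int> arr(100);            // owns a dynamic array, no new[]/delete[]

    // ... processing ...
    return 0;   // destructors run here; nothing to leak
}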

Matthieu M.
  • 287,565
  • 48
  • 449
  • 722
3

Let's say we have this program running:

while(true)
{
    int* arr = new int;
}

The short-term problem is that your computer will eventually run out of memory and the program will crash.

Instead, we could have this program that would run forever because there is no memory leak:

while(true)
{
    int* arr = new int;
    delete arr;
}

When a simple program like this crashes, there are no long-term consequences, because the operating system frees the memory after the crash.

But you can imagine more critical systems where a system crash will have catastrophic consequences such as:

while(true)
{
    int* arr = new int;
    generateOxygenForAstronauts();
}

Think about the astronauts and free your memory!

WebF0x
  • 165
  • 9
  • That wouldn't keep going until the system dies, because this error comes up: terminate called after throwing an instance of 'St9bad_alloc' what(): std::bad_alloc Aborted – Novin Shahroudi Jan 19 '14 at 19:11
0

A tool that runs for a short period of time and then exits can often get away with having memory leaks, as your example indicates. But a program that is expected to run without failure for long periods of time must be completely free of memory leaks. As others have mentioned, the whole system will bog down first. Additionally, code that leaks memory often is very bad at handling allocation failures - the result of a failed allocation is usually a crash and loss of data. From the user's perspective, this crash usually happens at exactly the worst possible moment (e.g. during file save, when file buffers get allocated).
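
As a rough illustration of that last point (saveFile and the buffer size are hypothetical), the failure shows up as std::bad_alloc, and a program that has been leaking is exactly the one likely to hit it during a save:

#include <iostream>
#include <new>
#include <vector>

// Hypothetical save routine that needs a large temporary buffer.
bool saveFile()
{
    try {
        std::vector<char> buffer(64 * 1024 * 1024);   // may throw std::bad_alloc
        // ... serialize the document into buffer and write it out ...
        return true;
    } catch (const std::bad_alloc&) {
        std::cerr << "save failed: out of memory\n";
        return false;   // degrade gracefully instead of crashing mid-save
    }
}

int main()
{
    return saveFile() ? 0 : 1;
}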

Mike Woolf
  • 1,210
  • 7
  • 11
0

Well, it is a strange question, since the immediate answer is straightforward: as you lose memory to memory leaks, you can/will eventually run out of memory. How big a problem that is for a specific program depends on how big each leak is, how often these leaks occur, and for how long. That's all there is to it.

A program that allocates a relatively small amount of memory and/or is not run continuously might not suffer any problems from memory leaks at all. But a program that runs continuously will eventually run out of memory, even if it leaks very slowly.

Now, if one decided to look at it closer, every block of memory has two sides to it: it occupies a region of addresses in the address space of the process and it occupies a portion of the actual physical storage.

On a platform without virtual memory, both sides work against you. Once the memory block is leaked, you lose the address space and you lose the storage.

On a platform with virtual memory the actual storage is a practically unlimited resource. You can leak as much memory as you want, you will never run out of the actual storage (within practically reasonable limits, of course). A leaked memory block will eventually be pushed out to external storage and forgotten for good, so it will not directly affect the program in any negative way. However, it will still hold its region of address space. And the address space still remains a limited resource, which you can run out of.

One could say that if we take an imaginary virtual-memory platform with an address space overwhelmingly larger than anything our process could ever consume (say, a 2048-bit platform running a typical text editor), then memory leaks would have no consequence for our program. But in real life memory leaks typically constitute a serious problem.

AnT stands with Russia
  • 312,472
  • 42
  • 525
  • 765
-1

Nowadays, compilers perform some optimizations on your code before generating the binary, so a single new without a matching delete wouldn't do much harm.

But in general, for every new you should delete the portion of memory you've reserved in your program.

Also be aware that simply calling delete doesn't guarantee you won't run out of memory. There are different mechanisms, on both the OS and the compiler/runtime side, that control this.

This link may help you a little. And this one too.

Novin Shahroudi
  • 620
  • 8
  • 18