
I have a function that uses 0.5 MB of memory each time I run it. I decided to investigate it step by step while watching the Windows Task Manager. I noticed that after these lines:

int **banned;
banned=new int*[vertices];
for(i=0;i<vertices;i++)
    banned[i]=new int[k_colors];

it uses 0.5 MB of memory. Then I decided to delete it before the return line:

for(i=0;i<vertices;i++)
   for(j=0;j<k_colors;j++)
       delete []banned[j];
   delete[]banned;

At the beginning of the function, the process was using 8.5 MB of memory. After the allocation it became 9 MB, but after the delete part it was still 9 MB. And I execute this function 1000 times over the whole program; then it gets killed by the OS. Any idea why that is and how I can solve it?

EDIT: Here is the main() part:

int main()
{
    srand(time(0));
    input();
    initialize();
    for(int i = 0; i < MAX_GENERATION; i++)
    {
        parents = selection(TS);
        population = cross_over(parents, PC);
        mutation(PM);
        elite=tabu_search(population);
        elitism(); //270 MB memory using each time.

    }
fclose(pFile);
return 0;
}

Above, the first lines of the elitism() function are the allocation part, and its last lines are the delete part.

Who Cares
  • `delete[]` doesn't _return the memory to the OS_. – πάντα ῥεῖ Nov 09 '14 at 08:53
  • Ok, then how can I do that? – Who Cares Nov 09 '14 at 08:54
  • Depends on what your OS offers for such. But I doubt there is a suitable way. – πάντα ῥεῖ Nov 09 '14 at 08:55
  • Same thing happens in Unix too. I uploaded my work there; after a few minutes it gets killed. The terminal just says `Killed`. – Who Cares Nov 09 '14 at 08:57
  • It doesn't really matter whether it returns the memory to the OS. What matters is that the memory gets returned in such a way that future allocations can re-use that memory. Whether that's by returning it to the OS or by internally marking it as unused is irrelevant. Either way, you're right that repeatedly running that same function should not cause a repeated increase in memory use, that does appear to indicate a leak somewhere. –  Nov 09 '14 at 08:58
  • @hvd I think you are right, but I am going blind here. I watched every step in the Task Manager and it happens because of this allocation. Do you think I should move the allocation part out of the main loop in `main()`? Would that work? – Who Cares Nov 09 '14 at 09:01
  • Have you tried running the program through [Valgrind](http://valgrind.org/)? – GoBusto Nov 09 '14 at 09:03
  • maybe dupe: http://stackoverflow.com/questions/22323037/returning-dynamically-allocated-memory-back-to-os-without-terminating-the-progra – M.M Nov 09 '14 at 09:03
  • you are not `delete`ing the same things you are `new`ing in this code, btw – M.M Nov 09 '14 at 09:04
  • @WhoCares There's not enough in your question for me to give a more useful answer. (Well, I see there's enough for Matt McNabb to spot a problem, but not for me. :)) If you have a complete sample program that you can put in your question that repeatedly allocates and then deallocates memory, shouldn't fail unless the first iteration fails, but does fail, much later, that would really help get a good answer. –  Nov 09 '14 at 09:05
  • @WhoCares - Does your program leak memory? If not, then there is nothing to worry about. The C++ heap manager is what is controlling whether memory is actually returned to the OS or not. If you called that function 1,000 times where you allocate and deallocate the memory, the memory allocated doesn't increase after the first time (or it shouldn't). – PaulMcKenzie Nov 09 '14 at 09:07
  • @GoBusto I don't know what it is, but certainly I am going to. – Who Cares Nov 09 '14 at 09:13
  • @hvd I updated it. I hope it works. – Who Cares Nov 09 '14 at 09:13
  • @PaulMcKenzie but it is increasing – Who Cares Nov 09 '14 at 09:14
  • @MattMcNabb I am about to try what you said. Hold on. – Who Cares Nov 09 '14 at 09:14
  • @WhoCares - That is because your code is wrong. You loop one way to allocate, then something totally different to deallocate. In other words, you lost track and didn't reverse your steps properly. – PaulMcKenzie Nov 09 '14 at 09:16
  • @PaulMcKenzie Ok then, what can I do to do it properly? – Who Cares Nov 09 '14 at 09:28
  • @WhoCares - The answer given to you below by MattMcNabb fixes the problem. – PaulMcKenzie Nov 09 '14 at 09:29
  • @PaulMcKenzie aha, ok, I had forgotten to try it. Now it worked. Thanks a lot, guys. – Who Cares Nov 09 '14 at 09:31

2 Answers


To use delete[] properly, you should delete the same things that you new'd:

for(i=0;i<vertices;i++)
    delete [] banned[i];

delete[] banned;

Your "process getting killed" is probably because your original code caused crazy amounts of undefined behaviour, deleting the same pointer multiple times and so on.

This version may or may not release memory to the operating system; that is a decision made by your compiler/library and the operating system. On some systems, the memory may appear to still be allocated to your process, but the OS will be able to claim it if another process needs it.

If you call the same function over and over, it should not accumulate memory though; the previously delete'd blocks can satisfy the next allocation.

M.M
  • Hey again. This worked very well in Windows, but in Linux it gives a segfault. Do you know why? – Who Cares Nov 09 '14 at 10:40
  • You have a bug in the code that you didn't post. Try debugging your program. In linux use gdb and valgrind. – M.M Nov 09 '14 at 10:41
  • Now I debugged it with valgrind. Here is the thing: I deleted the code you gave me, and it worked well in Linux. In that shape, I ran it with valgrind and it gave an error about `fscanf()`. However, they are very different parts of the program, I mean the `fscanf()` and the `delete` part. And valgrind didn't tell me anything about the code you gave me if I add it back; it still says something about `fscanf()`. Do you think they are related somehow? – Who Cares Nov 09 '14 at 10:53
  • btw I ran it with `valgrind --leak-check=yes`. – Who Cares Nov 09 '14 at 10:57
  • Start a new question; `fscanf` has nothing to do with this one. It sounds like you have heap corruption. Make sure you post an MCVE, otherwise people are just guessing. – M.M Nov 09 '14 at 10:58

When you allocate memory in your application, the C++ runtime asks the OS for more memory. On the assumption that you will allocate again soon, it holds on to freed memory unless a huge amount is released. In other words, Task Manager and similar tools are not ideal for understanding exactly how much memory is ACTUALLY allocated in your application.

However, delete [] does indeed work in all commercial-grade compiler environments.

The problem in your code is that you are deleting things you didn't allocate.

Call delete [] once for each of the vertices and once for the whole array, matching your new calls. [Every place you have a new, there should be exactly one matching delete - same number of times, same pointers.]

It is also possible that you are "fragmenting" the memory - for example, something like this:

size_t s = 100;
for(;;)
{
    int *p = new int[s];

    ...
    delete [] p; 
    s += 10;
}

So, the freed memory is "too small" for the next allocation.

Of course, this whole mess could be avoided by using

vector< vector <int> > banned(vertices);
for(i=0;i<vertices;i++)
   banned[i].resize(k_colors);

And now the memory used gets cleaned up automatically.

Mats Petersson
  • If by "work" you mean "impossible to use without introducing endless hideous bugs and strictly inferior in every way to other approaches", then yes. That's a rather strange definition of "work", though. – Puppy Nov 09 '14 at 09:13
  • 3
Uh, that's a different point. The fundamentals of most of the container types are `new` and `delete` [although typically wrapped in an allocator class, but that's just sugar on top]. – Mats Petersson Nov 09 '14 at 09:17