
I wrote a Hello World console application in C#:

using System;

class Program
{
  static void Main(string[] args)
  {
    Console.WriteLine("Hello World");

    // Keep the process alive so its memory can be inspected in Task Manager.
    Console.Read();
  }
}

When I launched it, this is the memory it took: [screenshot: Task Manager showing a working set of about 7 MB]

Then I created a dump file of this process: [screenshot: creating the dump file]

After the dump was created, this is the memory the process took: [screenshot: Task Manager showing a working set of about 46 MB]

You can see a big change in the working set size, which was a surprise to me.

Some other interesting things about this memory increase:

  1. After my process's working set increased to 46 MB, it never seemed to drop back to 7 MB.
  2. The size of the dump file is not 7 MB but 46 MB.
  3. The private working set only increased from 1.83 MB to 2.29 MB.
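
(For reference, here is a minimal sketch of how I could read these counters from code instead of Task Manager, using System.Diagnostics. Note that PrivateMemorySize64 reports private bytes, which is related to, but not the same as, Task Manager's "private working set" column.)

using System;
using System.Diagnostics;

class MemoryCounters
{
  static void Main(string[] args)
  {
    Process self = Process.GetCurrentProcess();
    self.Refresh(); // make sure the counters are current, not cached

    // WorkingSet64 corresponds to the "Working set" column in Task Manager.
    Console.WriteLine("Working set:   {0:N0} bytes", self.WorkingSet64);

    // PrivateMemorySize64 is private bytes (committed private memory),
    // close to, but not identical to, "Memory (private working set)".
    Console.WriteLine("Private bytes: {0:N0} bytes", self.PrivateMemorySize64);

    Console.Read();
  }
}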

And here are my questions:

1. Why is there a memory increase along with the dump-creation operation?

2. In the past, when a tester reported a memory leak and sent me a dump file, I treated the dump file size as the amount of memory the target process was using in the lab environment. But from the simple example above, it seems I was always wrong?

3. I am super curious about what's inside the 46 MB dump file (put another way, why does the current Hello World application take 46 MB of memory?). I am familiar with SOS commands like !DumpHeap or !eeheap, but those commands are not enough to tell everything inside the 46 MB file. Could anyone share some useful tools, links or instructions?

Thanks very much for any help!

Tiger.Xing
  • See related: http://stackoverflow.com/questions/1984186/what-is-private-bytes-virtual-bytes-working-set You will notice that your private bytes and commit size did not change much; it is private bytes that you should be looking at, but even then it is a little more complicated, if you read the link – EdChum Sep 19 '14 at 10:10
  • To answer your questions: the increase is due to the memory dump generation and will include memory-mapped information the process is not strictly using at the moment, hence your private bytes have not increased much. The final dump file may be the total memory your process used, but it depends on the flags used to generate the dump, as additional handle info and other stuff may be in the dump: http://blogs.msdn.com/b/debugger/archive/2009/12/30/what-is-a-dump-and-how-do-i-create-one.aspx – EdChum Sep 19 '14 at 10:14

1 Answer


In 95% of the cases (or even more; I don't have statistics on it), you needn't worry about the Working Set. It is rather misleading that Microsoft chose the Working Set columns as the default columns in Task Manager.

The memory an application needs is called virtual memory. You can distinguish three different types of virtual memory (the sketch after this list makes the distinction concrete):

  1. reserved memory, which exists neither in RAM nor on disk
  2. committed memory which is currently not needed and has therefore been paged out to disk
  3. committed memory which is currently needed and therefore available in RAM for the CPU to access. This is called the Working Set.
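
Here is a minimal sketch of my own (using P/Invoke to the VirtualAlloc API; the 100 MB size is arbitrary) that reserves, commits, and finally touches memory. If you watch Task Manager while it runs, only the last step moves the Working Set:

using System;
using System.Runtime.InteropServices;

class VirtualMemoryKinds
{
  const uint MEM_RESERVE    = 0x2000;
  const uint MEM_COMMIT     = 0x1000;
  const uint PAGE_NOACCESS  = 0x01;
  const uint PAGE_READWRITE = 0x04;

  [DllImport("kernel32.dll", SetLastError = true)]
  static extern IntPtr VirtualAlloc(IntPtr lpAddress, UIntPtr dwSize,
                                    uint flAllocationType, uint flProtect);

  static void Main(string[] args)
  {
    const int size = 100 * 1024 * 1024; // 100 MB

    // 1. Reserved: address space only, backed by neither RAM nor disk.
    //    Neither the commit charge nor the Working Set changes.
    IntPtr reserved = VirtualAlloc(IntPtr.Zero, (UIntPtr)size,
                                   MEM_RESERVE, PAGE_NOACCESS);

    // 2. Committed: backed by the page file (commit charge grows), but not
    //    yet in the Working Set, because no page has been touched.
    IntPtr committed = VirtualAlloc(IntPtr.Zero, (UIntPtr)size,
                                    MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);

    // 3. Touching the committed pages forces them into RAM, i.e. into the
    //    Working Set. Watch Task Manager while this loop runs.
    for (int i = 0; i < size; i += 4096) // one write per 4 KB page
      Marshal.WriteByte(committed, i, 1);

    Console.Read(); // keep the process alive for observation
  }
}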

There are many reasons why Windows decreases or increases the Working Set. In many cases the reason can be found in applications other than yours. However, in the case you describe, it is quite obvious:

  1. The debugger suspends the process.
  2. The debugger creates a new thread and triggers the MiniDumpWriteDump function (see the sketch after this list).
  3. That function needs to read all committed memory in order to write it into the dump.
  4. When accessing the virtual memory, those parts of it which have been paged out to disk need to be paged back into RAM.
  5. As we learned before, virtual memory in RAM is called the Working Set, so the Working Set increases.
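
For illustration, here is a minimal sketch of such a dump-writing call (my own sketch, not the exact code Task Manager or a debugger uses). The "read all committed memory" of step 3 happens inside the MiniDumpWriteDump call, and MiniDumpWithFullMemory is the kind of flag that makes the dump roughly as large as the committed memory, in line with EdChum's comment about dump flags:

using System;
using System.Diagnostics;
using System.IO;
using System.Runtime.InteropServices;

class DumpWriter
{
  // One of the MINIDUMP_TYPE flags: include all committed memory of the
  // target, which makes the file roughly as large as that memory.
  const uint MiniDumpWithFullMemory = 0x00000002;

  [DllImport("dbghelp.dll", SetLastError = true)]
  static extern bool MiniDumpWriteDump(IntPtr hProcess, uint processId,
    Microsoft.Win32.SafeHandles.SafeFileHandle hFile, uint dumpType,
    IntPtr exceptionParam, IntPtr userStreamParam, IntPtr callbackParam);

  static void Main(string[] args)
  {
    // Pass the target process ID on the command line.
    Process target = Process.GetProcessById(int.Parse(args[0]));

    using (FileStream fs = new FileStream("target.dmp", FileMode.Create))
    {
      // Step 3 happens in here: the call reads all committed pages of the
      // target, paging them back into RAM and thus into its Working Set.
      bool ok = MiniDumpWriteDump(target.Handle, (uint)target.Id,
        fs.SafeFileHandle, MiniDumpWithFullMemory,
        IntPtr.Zero, IntPtr.Zero, IntPtr.Zero);

      Console.WriteLine(ok
        ? "Dump written."
        : "Failed, error " + Marshal.GetLastWin32Error());
    }
  }
}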

However, it still depends on what else you are doing with your PC while creating the dump. Try running a multi-threaded application that makes heavy use of memory while you are dumping the process (a sketch of such an application follows below). You will probably find that Windows takes that situation into account and, before paging memory in, pages out memory of your own process. In that case, the Working Set will not increase as much as it did in your screenshot.
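
A minimal sketch of such a memory-hungry neighbor process (the thread count and buffer sizes are arbitrary; adjust them to your machine's RAM):

using System;
using System.Threading;

class MemoryPressure
{
  static void Main(string[] args)
  {
    // A few threads that keep touching large buffers, so that their pages
    // stay hot and Windows keeps them in this process's Working Set.
    for (int t = 0; t < 4; t++)
    {
      new Thread(() =>
      {
        byte[] buffer = new byte[256 * 1024 * 1024]; // 256 MB per thread

        while (true)
        {
          // One write per 4 KB page keeps the whole buffer resident.
          for (int i = 0; i < buffer.Length; i += 4096)
            buffer[i]++;
        }
      }).Start();
    }

    Console.Read(); // keep running while you dump the other process
  }
}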

So, again: don't worry about the Working Set. Windows just thought it had enough RAM left to speed up the dump creation process. It is likely that it has used memory which was otherwise unused or decreased the amount of RAM used for disk cache.

If you want to have a look at the RAM contents, try SysInternals RAMMap. Repeat the dump creation under different circumstances, e.g. while copying files or while doing memory-heavy calculations.

Thomas Weller