
I have a WinForms application that periodically polls a TCP server and downloads some user data (in JSON notation). For some reason, the memory usage of the application increases with every call of the method below:

    private void timerElapsed(object sender, ElapsedEventArgs e)
    {
        if (!isPolling)
        {
            isPolling = true;

            try
            {
                using (System.Net.WebClient wc = new System.Net.WebClient())
                {
                    jsonTemp = wc.DownloadString(serverUrl);

                    isPolling = false;
                }
            }
            catch (Exception ex)
            {
                isPolling = false;
            }
        }
        else
        {
            isPolling = false;
        }
    }

Whenever wc.DownloadString is called, the footprint of my application increases.

Since WebClient already implements IDisposable, it should automatically be disposed at the end of the using statement, or am I wrong?

lenniep
  • How are you checking memory usage? Virtual memory may increase, for example, due to creating a lot of temporary objects, even when the garbage collector collects them. It is just not returned to the system because that might not be necessary. – Konrad Kokosa Apr 29 '14 at 11:37
  • At the moment, I am only observing the values in Windows Task Manager. However, .NET Memory Profiler also reports increasing values. I think I'll check the memory usage again with perfmon to get additional information. – lenniep Apr 29 '14 at 11:52
  • 2
    yes, I would continue with perfmon analysis and would take full memory dump after long running of the program (to have clear situation). – Konrad Kokosa Apr 29 '14 at 11:57

1 Answer


Well, that is to be expected. The memory usage will only go down after a garbage collection. This is not C or Pascal - memory isn't released when the variable goes out of scope or when the using block ends; it is only released under memory pressure. The .NET model makes allocations extremely cheap, while collections are relatively expensive and don't depend much on how much memory in total you are freeing - thus, it's much faster to sacrifice a few MiB to hold garbage in memory a little longer and release a hundred objects at once, rather than doing a collection a hundred times.
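You can observe this directly with `GC.GetTotalMemory`, which reports the managed heap size and can optionally force a full collection first. A minimal sketch (the allocation loop is just a stand-in for repeated `DownloadString` results that immediately become garbage):

```csharp
using System;

class GcDemo
{
    static void Main()
    {
        // Allocate a batch of short-lived strings, similar to repeated
        // DownloadString results that are overwritten on every poll.
        for (int i = 0; i < 10000; i++)
        {
            string s = new string('x', 1024);
        }

        // Without forcing a collection, the heap may still hold the dead objects.
        long before = GC.GetTotalMemory(forceFullCollection: false);

        // Forcing a full collection reclaims them; only now does the number drop.
        long after = GC.GetTotalMemory(forceFullCollection: true);

        Console.WriteLine("before: {0}, after: {1}", before, after);
    }
}
```

The exact numbers vary from run to run, but the point is that the `before` figure includes garbage that the runtime simply hasn't bothered to collect yet.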

How big are the numbers we're talking about here? A few MiB is nothing to be concerned about; it's only when the usage climbs steadily on average (over hundreds of thousands of calls) that you should start looking for a leak. Also, what kind of memory are you checking? Private memory? Virtual memory? Committed memory?
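These counters can differ wildly for the same process, which is why "the number in Task Manager" alone is ambiguous. A quick sketch using `System.Diagnostics.Process` to read the main ones for the current process:

```csharp
using System;
using System.Diagnostics;

class MemoryCounters
{
    static void Main()
    {
        using (Process p = Process.GetCurrentProcess())
        {
            // Roughly what Task Manager's default "Memory" column shows.
            Console.WriteLine("Working set:   {0} KiB", p.WorkingSet64 / 1024);

            // Memory committed exclusively to this process (what perfmon
            // calls "Private Bytes") - usually the most useful for leak hunting.
            Console.WriteLine("Private bytes: {0} KiB", p.PrivateMemorySize64 / 1024);

            // Reserved address space; typically much larger than either above.
            Console.WriteLine("Virtual size:  {0} KiB", p.VirtualMemorySize64 / 1024);
        }
    }
}
```

If private bytes grow without bound over many polls while the working set stays flat, that points in a very different direction than all three growing together.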

A possible memory leak might occur if a garbage collection starts while there are pinned handles in memory - that prevents heap compaction from happening. You can see this quite easily in CLRProfiler. Again, though, this only really matters if heap fragmentation gets too high.
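For illustration, pinning is what `GCHandleType.Pinned` does: it fixes an object's address (e.g. so native code can write into a buffer), and while the handle is alive the compactor cannot move that object. A minimal sketch:

```csharp
using System;
using System.Runtime.InteropServices;

class PinningDemo
{
    static void Main()
    {
        byte[] buffer = new byte[4096];

        // Pinning fixes the array's address in memory. While the handle is
        // alive, the GC cannot relocate the array during compaction, which
        // fragments the heap if it happens often or the pins are long-lived.
        GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        try
        {
            IntPtr address = handle.AddrOfPinnedObject();
            Console.WriteLine("Pinned at 0x{0:X}", address.ToInt64());
        }
        finally
        {
            // Freeing the handle promptly lets the compactor move the array again.
            handle.Free();
        }
    }
}
```

Keeping pins short-lived, as the try/finally here does, is what prevents the fragmentation the answer describes.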

As for IDisposable and the GC, I've got this answer: Does the "using" keyword mean the object is disposed and GC'ed?

Luaan
  • Better a delayed response than no response at all. Luaan pointed me in the right direction - it seems the GC lets the objects live a little longer; while running a test over multiple days, the memory usage stopped climbing. – lenniep Jan 26 '15 at 05:58