4

I am currently designing a website in C# that uses some fairly complex code to generate a search list, which I am implementing as a tree-like structure.

The main reason I am using a tree is that this is a high-traffic website with a very complex search filter, so scalability is very important. However, I am worried that the memory cost of the tree may outweigh the processing cost of simply recalculating values every time.

Is there a reliable way to measure the size of a dictionary in C#? The Marshal.SizeOf() method won't work here, as it only applies to unmanaged types.

Peter Mortensen
Ed James
  • +1 I had the same problem: http://stackoverflow.com/questions/751710/how-to-find-out-size-of-asp-net-session-when-there-are-non-serializable-objects – Vilx- Jun 10 '09 at 11:02
  • 1
    Consider using the memory profiler; its purpose is to help you analyze your use of memory. http://msdn.microsoft.com/en-us/library/ms979205.aspx – Eric Lippert Jun 10 '09 at 15:03

3 Answers

3

The best bet is to run the site under different load models and check the relevant performance counters.

As a simplification, you could extract just the code that creates and stores data structures, and embed that into a console application. Run a simulated load and check the performance counters there. You can vary things and measure the impact on memory consumption and garbage collection. Doing this would isolate the effects you want to measure.
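A minimal sketch of such a console harness. `BuildSearchTree` is a hypothetical stand-in for the site's real tree-building code, and the element counts are arbitrary; `GC.GetTotalMemory(true)` forces a full collection first, so the figures reflect only live managed-heap data.

```csharp
using System;
using System.Collections.Generic;

class LoadHarness
{
    // Placeholder for the real data structure under test.
    static Dictionary<int, string> BuildSearchTree(int items)
    {
        var tree = new Dictionary<int, string>(items);
        for (int i = 0; i < items; i++)
            tree[i] = "value-" + i;
        return tree;
    }

    static void Main()
    {
        foreach (int items in new[] { 10000, 100000, 500000 })
        {
            long before = GC.GetTotalMemory(true); // collect, then measure
            var tree = BuildSearchTree(items);
            long after = GC.GetTotalMemory(true);

            Console.WriteLine("{0,7} items: ~{1} KB on the managed heap, " +
                              "gen0={2} gen1={3} gen2={4}",
                              items, (after - before) / 1024,
                              GC.CollectionCount(0), GC.CollectionCount(1),
                              GC.CollectionCount(2));
            GC.KeepAlive(tree); // keep it reachable until after measurement
        }
    }
}
```

In a real run you would also sample the .NET CLR Memory performance counters at intervals, as described above, rather than relying on a single snapshot.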

If you're thinking, "gee, that sounds like a lot of work," then you are better off just buying more memory.

Peter Mortensen
Cheeso
  • 1
    That does sound like a lot of work, but I'm not sure that buying more memory is going to work, given that there can easily be a few hundred thousand objects in the dictionary! – Ed James Jun 10 '09 at 11:17
  • Not really too much work. Write the code that simulates different scenarios and strategies. Run each strategy through multiple sessions, each session for 5-10 minutes or more, and query the important perf counters every 15 or 20 seconds. At the end of the run, automatically produce Excel sheets and charts with the output for each cycle. When you have all this set up, push the start button. Go to lunch... Come back 3 hours later. – Cheeso Jun 10 '09 at 11:26
  • That sounds like a good plan, I'll give myself a day to do it later in the build! – Ed James Jun 10 '09 at 11:48
0

One way to do it: initialize your tree with an arbitrary number of elements, measure the memory consumed by the process, add 1,000 new elements, measure again, then subtract and divide.
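A minimal sketch of that approach. This version reads the managed heap via `GC.GetTotalMemory` (the answer's process-wide measurement would use `Process.GetCurrentProcess().PrivateMemorySize64` instead); the starting population and value strings are arbitrary.

```csharp
using System;
using System.Collections.Generic;

class PerEntryCost
{
    static void Main()
    {
        var dict = new Dictionary<int, string>();
        for (int i = 0; i < 50000; i++)       // arbitrary starting population
            dict[i] = "seed-" + i;

        long before = GC.GetTotalMemory(true);
        for (int i = 50000; i < 51000; i++)   // add 1,000 new elements
            dict[i] = "probe-" + i;
        long after = GC.GetTotalMemory(true);

        Console.WriteLine("~{0} bytes per entry", (after - before) / 1000);
        GC.KeepAlive(dict);
    }
}
```

Because Dictionary doubles its internal arrays as it grows, a single run can catch a resize spike; averaging several runs (or pre-sizing the dictionary) smooths that out.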

Alexandru Nedelcu
0

You could create a small app with a single Dictionary and analyze it in several scenarios using WinDbg or CLR Profiler.

I don't think there is a way to get the total size of an object at run time (apart from traversing its internal fields using reflection?)
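A rough sketch of that reflection idea, for illustration only: it ignores object headers, padding, and CLR bookkeeping, and uses marshaled sizes for primitives (which differ slightly from CLR sizes for `char` and `bool`), so the result is just a lower-bound ballpark.

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Reflection;
using System.Runtime.CompilerServices;
using System.Runtime.InteropServices;

// Identity-based comparer so each reference is visited only once.
class RefComparer : IEqualityComparer<object>
{
    bool IEqualityComparer<object>.Equals(object a, object b) { return ReferenceEquals(a, b); }
    int IEqualityComparer<object>.GetHashCode(object o) { return RuntimeHelpers.GetHashCode(o); }
}

static class RoughSizer
{
    public static long Estimate(object root)
    {
        return Walk(root, new HashSet<object>(new RefComparer()));
    }

    static long Walk(object obj, HashSet<object> seen)
    {
        if (obj == null || !seen.Add(obj)) return 0;

        var s = obj as string;
        if (s != null) return 2L * s.Length;            // UTF-16 character data only

        var e = obj as IEnumerable;
        if (e != null)                                  // arrays and collections
        {
            long sum = 0;
            foreach (object item in e) sum += Walk(item, seen);
            return sum;
        }

        long total = 0;
        foreach (FieldInfo f in obj.GetType().GetFields(
                     BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic))
        {
            if (f.FieldType.IsPrimitive)
                total += Marshal.SizeOf(f.FieldType);   // int -> 4, double -> 8, ...
            else
                total += Walk(f.GetValue(obj), seen);   // recurse into references/structs
        }
        return total;
    }
}
```

For a real answer, the WinDbg `!objsize` command does this walk properly against the actual heap layout.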

vgru