I seek advice on either proving or dispelling a belief held on my team (apparently without evidence). The belief is that starting a new .NET application process is expensive in terms of memory (20 MB and up per process). When I point out that the app clearly isn't using that much (as seen in a memory profiler), they counter that it is not the app itself but the .NET Framework runtime that consumes the memory.
This is based on something they've all heard somewhere, so no solid proof exists, but the belief is deeply ingrained in the team. I've googled around, but I can't find any serious analysis of the per-process memory cost of the .NET Framework runtime. I simply can't accept that each .NET process is that expensive (though I'm willing to admit I may be wrong), yet I don't know enough to prove my point, and my teammates don't know enough to prove me wrong. Does anyone know of any research or analysis on the matter?
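For what it's worth, here is a minimal sketch of how I imagine one could measure the baseline cost empirically: a bare console app that reports its own memory counters. I'm assuming the usual caveat that working set includes pages shared with other processes (such as the framework DLLs), so private bytes is probably the fairer measure of true per-process cost; numbers will also vary by runtime version, bitness, and OS, so this is only a rough probe, not a definitive benchmark.

```csharp
using System;
using System.Diagnostics;

class MemoryProbe
{
    static void Main()
    {
        // Collect first so we measure the runtime's resting baseline
        // rather than transient allocations left over from startup.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();

        using (Process self = Process.GetCurrentProcess())
        {
            // Private bytes: memory committed exclusively to this process.
            Console.WriteLine("Private bytes : {0:N0}", self.PrivateMemorySize64);
            // Working set: physical memory in use, including pages shared
            // with other processes (e.g. framework DLLs), so it overstates
            // the true per-process cost.
            Console.WriteLine("Working set   : {0:N0}", self.WorkingSet64);
            // Managed heap only, after a full collection.
            Console.WriteLine("Managed heap  : {0:N0}", GC.GetTotalMemory(forceFullCollection: true));
        }

        // Keep the process alive so external tools (Task Manager, perfmon,
        // VMMap) can inspect it and cross-check the numbers.
        Console.WriteLine("Press Enter to exit...");
        Console.ReadLine();
    }
}
```

If nothing like this has been published as a proper analysis, I'd still appreciate pointers to anything more rigorous than running the above on one machine.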
Thank you.