
On my laptop, running 64-bit Windows 7 with 2 GB of free memory (as reported by Task Manager), I'm able to do:

var x = new Dictionary<Guid, decimal>( 30 * 1024 * 1024 );

I don't have a computer with more RAM at hand, so I'm wondering whether this will scale: on a computer with 4 GB of free memory, will I be able to allocate 60M items instead of "just" 30M, and so on?

Or are there other limitations (of .NET and/or Windows) that I'll bump into before I can consume all available RAM?

Update: OK, so I'm not allowed to allocate a single object larger than 2 GB. That's important to know! But then, of course, I'm curious to know whether I can fully utilize all memory by allocating 2 GB chunks like this:

  var x = new List<Dictionary<Guid, decimal>>();
  for ( var i = 0 ; i < 10 ; i++ )
    x.Add( new Dictionary<Guid, decimal>( 30 * 1024 * 1024 ) );

Would this work if the computer has more than 20 GB of free memory?

Dan Byström

2 Answers


There's a 2 GiB limit on every object in .NET; you are never allowed to create a single object that exceeds 2 GiB. If you need a bigger object, you have to build it from parts that are each smaller than 2 GiB, so you cannot have a contiguous array larger than 2 GiB, or a single string larger than 512 MiB. I'm not entirely sure about the exact string limit, but I've done some testing on the issue and got OutOfMemoryExceptions when I tried to allocate strings bigger than 512 MiB.

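As a quick illustration, here's a minimal sketch (the array size is an arbitrary example of my own; this assumes a runtime without the .NET 4.5 gcAllowVeryLargeObjects setting mentioned in the other answer):

  // ~2.3 GiB of longs in one contiguous array: exceeds the single-object
  // 2 GiB limit, so the allocation fails no matter how much RAM is free.
  try
  {
      var tooBig = new long[ 300 * 1024 * 1024 ]; // 300M elements * 8 bytes
  }
  catch ( OutOfMemoryException )
  {
      Console.WriteLine( "Hit the single-object 2 GiB limit." );
  }
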
These limits are subject to heap fragmentation, though. Even when the GC compacts the heap, large objects (anything above a somewhat arbitrary threshold of about 85,000 bytes) end up on the large object heap, which is a heap that isn't compacted. Strictly speaking, and somewhat as a side note: if you can keep your short-lived allocations below this threshold, it's better for your overall GC memory management and performance.

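A rough way to see that threshold in action (a sketch; that GC.GetGeneration reports large-object-heap objects as generation 2 is an implementation detail of the Microsoft CLR):

  var small = new byte[ 80 * 1000 ]; // below ~85,000 bytes -> small object heap
  var large = new byte[ 85 * 1000 ]; // at the threshold -> large object heap
  Console.WriteLine( GC.GetGeneration( small ) ); // typically 0 (freshly allocated)
  Console.WriteLine( GC.GetGeneration( large ) ); // 2: the LOH is reported as gen 2
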
John Leidegren
  • Thank you very much for this info! I made an update to my question. Feel free to comment on that before I mark this answer as accepted. :-) – Dan Byström May 24 '11 at 09:08
  • @danbystrom - As I said, this depends on heap fragmentation. The large object heap can become fragmented; when that happens, you end up with holes in the memory space (that's the fragmentation). If a memory request doesn't fit into any of the available holes and the system cannot free up more memory, you'll run into an OutOfMemoryException, despite having more than enough total system memory available. This is because memory is always allocated as contiguous blocks of bytes (there's no exception to this). The general rule here is: keep your allocations short-lived and small. – John Leidegren May 24 '11 at 10:06
  • I really cannot answer whether 30 MiB is a good choice or not, as that depends on the memory access patterns of your application. If you feel that there's a problem, there are tools for diagnosing these issues, such as memory profilers. They can provide the necessary insight if you don't already know what the problem is. – John Leidegren May 24 '11 at 10:07
  • OK, I'm well aware of heap fragmentation problems. I was more curious to find out whether there are any nasty surprises with virtual machine limitations and such. For now I'll live on with the assumption that I'll be able to utilize all free memory on the machine - until proven wrong! – Dan Byström May 24 '11 at 11:53
  • I'm no expert on virtual machines, but in my opinion, if done right, there isn't supposed to be any difference for software between virtual and non-virtual machines - especially at the level of abstraction you're at in the CLR world (a managed environment). – John Leidegren May 24 '11 at 12:29
  • Oh, I was using the term Virtual Machine to refer to the CLR itself being a process virtual machine. – Dan Byström May 24 '11 at 13:50
  • Yes, you can call that a VM if you like; most refer to it as the Common Language Runtime to avoid confusion (it's the Microsoft implementation of the Common Language Infrastructure, not to be confused with Mono). Patrick Dussud is the man behind the garbage collector used in the CLR, and he has given talks about GC at PDC (http://www.microsoftpdc.com/2009/FT51); you can find videos of him talking about GC and memory management where he goes into the details. You can also check out some of the talks from past Gamefests on how to play nice with memory management in a managed environment. – John Leidegren May 24 '11 at 14:11
  • .NET 4.5 now has the option, on x64, to explicitly allow objects to be larger than 2 GB by setting gcAllowVeryLargeObjects in the app.config. – kmote Nov 05 '14 at 23:21

Update: The 2 GB single-object memory limit has been lifted on 64-bit with the release of .NET 4.5.

You'll need to set gcAllowVeryLargeObjects in your app.config.

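For reference, the setting lives under the runtime element of app.config:

  <configuration>
    <runtime>
      <gcAllowVeryLargeObjects enabled="true" />
    </runtime>
  </configuration>
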
The maximum number of elements in an array is still 2^32-1, though (and the maximum index in any single dimension remains just under 2^31).

See Single objects still limited to 2 GB in size in CLR 4.0? for more details.

Eldritch Conundrum