
I am running a VB.NET program, built with VS 2013, targeting 64-bit, and have enabled gcAllowVeryLargeObjects.

I have a list of objects of a class. The class has various data properties, something like this:

Class cMyClass
    Property desc1 As String
    Property desc2 As String
    Property value As Double
End Class

I am populating this list via a read from SQL Server. In debug or release mode I can successfully put 100 million objects of this class in the list and operate on them just fine. At one point, though, I am populating the list with 150 million objects. When I run the program through Visual Studio in debug mode (or even in release mode, but still through VS), I have no problem populating the list with 150 million objects. But when I run the executable (compiled from release mode), it throws an error at this point (the error box tells me it is in a particular subroutine where the only thing happening is the filling of this list): "System.OutOfMemoryException: Array dimensions exceeded supported range."
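For context, a minimal sketch of the kind of load loop described above, assuming a plain `SqlDataReader` read; the connection string, query, table name, and column ordinals are placeholders, not the actual code:

Imports System.Collections.Generic
Imports System.Data.SqlClient

Module Loader
    ' Hypothetical sketch of the load step described in the question.
    ' Connection string, query, and column ordinals are placeholders.
    Function LoadAll() As List(Of cMyClass)
        Dim result As New List(Of cMyClass)()
        Using conn As New SqlConnection("Server=...;Database=...;Integrated Security=True"),
              cmd As New SqlCommand("SELECT desc1, desc2, value FROM SomeBigTable", conn)
            conn.Open()
            Using reader = cmd.ExecuteReader()
                While reader.Read()
                    result.Add(New cMyClass With {
                        .desc1 = reader.GetString(0),
                        .desc2 = reader.GetString(1),
                        .value = reader.GetDouble(2)
                    })
                End While
            End Using
        End Using
        Return result
    End Function
End Module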

I get that it's bad practice to load so much stuff into memory, but I am already very far down this road and need to solve it just once. I can solve it, clearly, by running the program through VS, but would like to understand why this works for me in VS (in debug mode or release mode) but not when running the executable.

I'll add that I don't think it's a hardware problem. The program uses over 20 GB of memory when running, but it's running on a box with 128 GB of RAM.

Thank you.

  • Do you think *populating the list with 150 million objects* might be related to an out of memory exception? – Ňɏssa Pøngjǣrdenlarp Mar 16 '18 at 19:42
  • @Plutonix Yes, I do think that. But that's what I don't understand. I can populate it with 150 million objects when running the program in VS. It's only when I run the executable that I have problems. See edits re physical memory, too. – Jimmy Mar 16 '18 at 19:44
  • Possibly related: https://stackoverflow.com/questions/1087982/single-objects-still-limited-to-2-gb-in-size-in-clr-4-0 There is also apparently an option in 4.5 to turn off the 2 GB limit. – TyCobb Mar 16 '18 at 19:52
  • @TyCobb Appreciate the reference, but I've already turned off that limit, and the program happily consumes over 20 GB of memory. It's somewhere just above that that it's dying. And, as noted, the machine has 128 GB of memory available, and it's not hitting that limit when I check in the task manager. – Jimmy Mar 16 '18 at 19:55
  • @Jimmy I understand that, and apparently that isn't your issue, but 20 GB of memory usage means nothing in terms of what I posted. That post was specifically about individual objects, i.e. a single instance of `Foo` consuming more than 2 GB. A `List` is still just an array underneath. – TyCobb Mar 16 '18 at 19:58
  • @TyCobb Understood, good point; the 20 GB memory consumption isn't strictly relevant. In this particular case, I know in what order things are happening, and only a couple of GB of that are consumed by objects other than the list in question. And as noted, the `gcAllowVeryLargeObjects` option in app.config is set to true, so I shouldn't be hitting the 2 GB object limit, I would think. – Jimmy Mar 16 '18 at 20:04
  • If I remember correctly, the Framework allocates more memory for the List structure once a certain utilization percentage of the List is reached; I think what ends up happening is the List doubles its allocation size. If the Framework is trying to allocate a contiguous block of memory for the expanded List, that contiguous block may not be available due to memory fragmentation, and thus the `Out of Memory` exception. – codechurn Mar 16 '18 at 20:07
  • When you are running your code in VS, is that a different machine than where the binary is being executed? Is the binary being executed on a Server? I ask these questions because the Garbage collector is much less aggressive on a Server than on a Workstation machine; you can change this setting however. See: https://learn.microsoft.com/en-us/dotnet/framework/configure-apps/file-schema/runtime/gcserver-element – codechurn Mar 16 '18 at 20:10
  • @codechurn - Interesting re contiguous memory. To answer your q's: yes, same machine for VS and binary. It is a server, though, in the sense it's operating on Windows Server 2012 R2. Are you suggesting I enable server garbage collection? – Jimmy Mar 16 '18 at 20:14
  • If the operating system is a Windows Server SKU, it will be running the less aggressive server version of the garbage collector (`<gcServer enabled="true"/>`), so you would want to turn it off. Try calling `GCSettings.IsServerGC` and see what that returns; if it returns true, then change the `<gcServer>` element to force Workstation mode, which is much more aggressive. You can also try to force garbage collection by doing a `GC.Collect(); GC.WaitForPendingFinalizers();` – codechurn Mar 16 '18 at 20:29
  • If you can predict the size of the list (i.e. 150 million), try creating it with that initial capacity to prevent re-allocation, during which you need memory for both the original backing array and the new backing array: `New List(Of cMyClass)(150000000)` (see the sketch after these comments). – TnTinMn Mar 16 '18 at 21:56
  • Thanks for these ideas. Re gcServer, I checked, and it returns False both in VS and in binary. – Jimmy Mar 16 '18 at 22:02
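Pulling together the suggestions from the comments above, a minimal sketch; the 150,000,000 capacity comes from the question, everything else (module and method names) is illustrative:

Imports System.Collections.Generic
Imports System.Runtime

Module GcDiagnostics
    Sub CheckGcAndPreallocate()
        ' Report which GC flavor the process is actually running under
        ' (codechurn's suggestion); False means workstation GC.
        Console.WriteLine("Server GC: " & GCSettings.IsServerGC.ToString())

        ' Pre-size the list so its backing array is allocated once up front,
        ' instead of being repeatedly doubled and copied as items are added
        ' (TnTinMn's suggestion). 150,000,000 comes from the question.
        Dim items As New List(Of cMyClass)(150000000)
        Console.WriteLine("Initial capacity: " & items.Capacity.ToString())
    End Sub
End Module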

1 Answer


Enable `gcAllowVeryLargeObjects` in your exe.config file (https://learn.microsoft.com/en-us/dotnet/framework/configure-apps/file-schema/runtime/gcallowverylargeobjects-element).
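For reference, a minimal exe.config with the element enabled; it goes in the `<runtime>` section of the config file that sits next to the executable (e.g. YourApp.exe.config):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <runtime>
    <!-- Lifts the 2 GB per-object limit for 64-bit processes (.NET Framework 4.5+). -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>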

Even when this is active, you still have a limit on the number of elements:

  • 4,294,967,295 total elements in a multi-dimensional array
  • 2,146,435,071 as the maximum index in any single dimension (for element types larger than one byte)
  • 2,147,483,591 as the maximum index for arrays of single-byte values (e.g. `Byte()`)

Note that, as TyCobb stated in the comments, the 2 GB limit that `gcAllowVeryLargeObjects` lifts applies per object, not per process; your process might already use 20 GB of RAM made up of many objects that are each smaller than 2 GB.
