15

I get an out-of-memory exception in C# when reading in a massive file.

I need to change the code, but for the time being, can I increase the heap size (like I would in Java) as a short-term fix?

Jack Kada
  • 24,474
  • 29
  • 82
  • 106
  • 1
    Are you reading the file in all at once? Is it possible to read and deal with the data a little at a time? – rerun Feb 24 '10 at 10:58
  • I believe .NET has memory-mapped files now as well, http://blogs.msdn.com/b/salvapatuel/archive/2009/06/08/working-with-memory-mapped-files-in-net-4.aspx – Kelly Elton Sep 21 '12 at 15:19
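
For reference, a minimal sketch of the memory-mapped approach mentioned in the comment above (requires .NET 4; the file path is a placeholder, not from the question):

    using System;
    using System.IO.MemoryMappedFiles;

    class MemoryMappedExample
    {
        static void Main()
        {
            // Maps the file into the address space instead of copying it onto the
            // managed heap; the OS pages the contents in and out on demand.
            // The path is a placeholder. On a 32-bit process, map smaller views
            // via CreateViewAccessor(offset, size) instead of the whole file.
            using (var mmf = MemoryMappedFile.CreateFromFile(@"C:\data\massive-file.bin"))
            using (var accessor = mmf.CreateViewAccessor())
            {
                byte firstByte = accessor.ReadByte(0);
                Console.WriteLine(firstByte);
            }
        }
    }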

5 Answers

8

.NET does that automatically.

It looks like you have reached the limit of the memory one .NET process can use for its objects (on a 32-bit machine this is 2GB as standard, or 3GB by using the /3GB boot switch. Credits to leppie & Eric Lippert for the info).

Rethink your algorithm, or perhaps a change to a 64-bit machine might help.

GvS
  • 52,015
  • 16
  • 101
  • 139
  • 1
    The process limit is 2GB on 32-bit, unless you use the /3GB boot switch. – leppie Feb 24 '10 at 11:04
  • I vaguely remember that .NET applications on 32 bit systems are limited to 800MB, or something. – OregonGhost Feb 24 '10 at 11:05
  • 2
    @OregonGhost: Chances are if you're allocating an 800MB contiguous block (like an array), you will not have enough virtual address space left. – leppie Feb 24 '10 at 11:07
  • 6
    Let's be precise here. One process can have *arbitrarily* much *memory* -- the per-process limit on 32 bit windows is sufficiently huge that you'd never in practice get to it. Of all that memory, only 2GB of it can be *mapped into user address space* at any one time. Or, if you have the 3GB switch, 3GB can be mapped. (The 3GB switch does not give you 4GB of address space; it's called "the 3GB switch" for a reason.) Since the .NET heap is always mapped into user address space, that limits the address space automatically made available to managed programs. – Eric Lippert Feb 24 '10 at 16:05
  • @Eric Lippert, edited again. I hope my simple answer is kind of correct now. – GvS Feb 24 '10 at 16:13
  • .NET 4.5 now allows more than 2GB, on 64-bit compiled processes only. – Piotr Kula Apr 16 '15 at 08:46
5

No, this is not possible. This problem might occur because you're running on a 32-bit OS and memory is too fragmented. Try not to load the whole file into memory (for instance, by processing it line by line, as sketched below) or, when you really need to load it completely, load it in multiple, smaller parts.
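
A minimal sketch of the line-by-line approach, assuming the file is text (ProcessLine is a placeholder for your own per-line work):

    using System.IO;

    class LineByLineExample
    {
        static void ProcessFile(string path)
        {
            // Only the current line is held in memory, so the file size
            // no longer dictates the process's memory footprint.
            using (var reader = new StreamReader(path))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    ProcessLine(line); // placeholder for your per-line work
                }
            }
        }

        static void ProcessLine(string line)
        {
            // ...
        }
    }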

aevitas
  • 3,753
  • 2
  • 28
  • 39
Steven
  • 166,672
  • 24
  • 332
  • 435
2

No, you can't. See my answer here: Is there any way to pre-allocate the heap in the .NET runtime, like -Xmx/-Xms in Java?

For reading large files it is usually preferable to stream them from disk, reading them in chunks and dealing with them a piece at a time instead of loading the whole thing up front.
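
For instance, a sketch of chunked reading from a FileStream (the 64 KB buffer size and HandleChunk are illustrative choices, not requirements):

    using System.IO;

    class ChunkedReadExample
    {
        static void ProcessFile(string path)
        {
            var buffer = new byte[64 * 1024]; // read 64 KB at a time

            using (var stream = File.OpenRead(path))
            {
                int bytesRead;
                while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Only the bytes actually read this iteration are valid.
                    HandleChunk(buffer, bytesRead);
                }
            }
        }

        static void HandleChunk(byte[] chunk, int count)
        {
            // placeholder for whatever is done with each chunk
        }
    }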

Community
  • 1
  • 1
Paolo
  • 22,188
  • 6
  • 42
  • 49
2

As others have already pointed out, this is not possible. The .NET runtime handles heap allocations on behalf of the application.

In my experience, .NET applications commonly suffer from OOM when there should be plenty of memory available (or at least, so it appears). The reason is usually the use of huge collections such as arrays, List (which uses an array to store its data), or similar.

The problem is that these types will sometimes create peaks in memory use. If those peak requests cannot be honored, an OOM exception is thrown. E.g. when a List needs to increase its capacity, it does so by allocating a new array of double the current size and then copying all the references/values from the old array to the new one. Similarly, operations such as ToArray make a new copy of the array. I've also seen similar problems with big LINQ operations.

Each array is stored as contiguous memory, so to avoid OOM the runtime must be able to obtain one big chunk of memory. As the address space of the process may be fragmented due to both DLL loading and general heap use, this is not always possible, in which case an OOM exception is thrown.
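
One practical mitigation, when you can estimate how many items you will store, is to pre-size the collection so those doubling-and-copying growth steps never happen. A minimal sketch (LoadRecords and expectedCount are illustrative names, not part of any particular API):

    using System.Collections.Generic;

    class CapacityExample
    {
        static List<string> LoadRecords(IEnumerable<string> source, int expectedCount)
        {
            // Pre-sizing avoids the repeated "allocate double, copy over" steps,
            // which is where the temporary memory peaks come from.
            var records = new List<string>(expectedCount);

            foreach (var item in source)
            {
                records.Add(item);
            }

            return records;
        }
    }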

Brian Rasmussen
  • 114,645
  • 34
  • 221
  • 317
0

What sort of file are you dealing with?

You might be better off using a StreamReader and yield returning the ReadLine result, if it's textual.

Sure, you'll be keeping a file-pointer around, but the worst case scenario is massively reduced.
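
A minimal sketch of that StreamReader/yield return pattern (TextFileReader is just an illustrative name; in .NET 4 and later, File.ReadLines provides essentially the same lazy behavior out of the box):

    using System.Collections.Generic;
    using System.IO;

    static class TextFileReader
    {
        // Lazily yields one line at a time; callers can foreach over the result
        // without the whole file ever being resident in memory.
        public static IEnumerable<string> ReadLines(string path)
        {
            using (var reader = new StreamReader(path))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    yield return line;
                }
            }
        }
    }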

There are similar methods for binary files; if you're uploading a file to SQL, for example, you can read into a byte[] buffer and use the SQL pointer mechanics to write the buffer to the end of a blob.

Russ Clarke
  • 17,511
  • 4
  • 41
  • 45