5

I'm getting a System.OutOfMemoryException when creating an array. Yet the length of my array does not exceed Int32.MaxValue.

This is the code (please don't judge it; it's not my code and it is at least 7 years old):

Dim myFileToUpload As New IO.FileInfo(IO.Path.Combine(m_Path, filename))
Dim myFileStream As IO.FileStream
Try
    myFileStream = myFileToUpload.OpenRead
    Dim bytes As Long = myFileStream.Length '(Length is roughly 308 million)
    If bytes > 0 Then
        Dim data(bytes - 1) As Byte 'OutOfMemoryException is caught here
        myFileStream.Read(data, 0, bytes)
        objInfo.content = data
    End If
Catch ex As Exception
    Throw ex
Finally
    myFileStream.Close()
End Try

According to this question "SO Max Size of .Net Arrays" and this question "Maximum length of an array", the maximum length is 2,147,483,647 elements (Int32.MaxValue) and the maximum size is 2 GB.

So the total length of my array is well within the limit (308 million < 2 billion), and the size is also way smaller than that 2 GB (the file size is 298 MB).

Question: With regard to arrays, what else could cause an OutOfMemoryException?

Note: For those wondering, the server still has some 10 GB of free RAM.

Note 2: Following dude's advice I monitored the number of GDI objects over several runs. The process never exceeds a count of 1500 objects.

User999999
2 Answers

3

A byte array is a sequence of bytes. That means the runtime has to allocate as much memory as the array's length in one contiguous block. If your memory is heavily fragmented, the system may not be able to allocate the array even though you have X GB of memory free.

For example, on my machine I'm not able to allocate more than 908,000,000 bytes in one array, but I can allocate 100 * 90,800,000 bytes without any problem if the data is split across multiple arrays:

// allocation in one array
byte[] tooBigArray = new byte[908000000]; // doesn't work (6 zeroes after 908)

// allocation in more arrays
byte[][] a = new byte[100][];

for (int i = 0; i < a.Length; i++) // works even though 10x more memory is needed than before
{
    a[i] = new byte[90800000]; // (5 zeroes after 908)
}
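The same idea applied to the asker's VB.NET could look like the sketch below — each chunk needs only a modest contiguous block, so fragmentation matters far less. The 64 MB chunk size and the `List(Of Byte())` shape are assumptions; the consumer of `objInfo.content` would have to be changed to accept multiple buffers.

```vb
Dim chunkSize As Integer = 64 * 1024 * 1024     '64 MB per chunk (size is an assumption)
Dim chunks As New List(Of Byte())
Using fs As IO.FileStream = myFileToUpload.OpenRead()
    Dim remaining As Long = fs.Length
    While remaining > 0
        Dim size As Integer = CInt(Math.Min(remaining, CLng(chunkSize)))
        Dim buffer(size - 1) As Byte
        Dim read As Integer = 0
        While read < size                       'Read may return fewer bytes than requested
            read += fs.Read(buffer, read, size - read)
        End While
        chunks.Add(buffer)
        remaining -= size
    End While
End Using
```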
Aik
  • I recently read about something like that (my memory allocation is just too big). Something along the lines of an array needing a contiguous part of memory. I'm trying right now to import the file in chunks. Hopefully that will solve the problem – User999999 Aug 07 '14 at 13:14
  • Yes, chunk it. Your system simply can't allocate the array... That said, you are then assigning it to another object as a whole... You may get the same issue... – Paul Kohler Aug 07 '14 at 13:27
  • Yeah. Looks like I'm looking at quite a code adjustment *sigh* – User999999 Aug 07 '14 at 13:51
1

You can read/write the data in place without loading it into memory first. Just System.IO.File.Copy the file if you don't want to change the original.

Dim strFilename As String = "C:\Junk\Junk.bmp" 'a big file
Using fs As New FileStream(strFilename, FileMode.Open)
  Dim lngLength As Long = fs.Length
  fs.Seek(lngLength \ 2, SeekOrigin.Begin)
  For l As Long = 0 To lngLength \ 4
    Dim b As Byte = CByte(fs.ReadByte())
    b = Not b
    fs.WriteByte(b)
  Next
End Using
MsgBox("Finished!")
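As noted in the comments, byte-by-byte I/O will be slow. One hedged variation is to process the same region through a larger working buffer; the 64 KB buffer size and the rewind-and-overwrite behaviour below are assumptions added for illustration, not part of the original answer.

```vb
Dim strFilename As String = "C:\Junk\Junk.bmp" 'a big file
Using fs As New FileStream(strFilename, FileMode.Open)
    Dim lngLength As Long = fs.Length
    fs.Seek(lngLength \ 2, SeekOrigin.Begin)
    Dim buffer(65535) As Byte                   '64 KB working buffer (size is an assumption)
    Dim remaining As Long = lngLength \ 4 + 1
    While remaining > 0
        Dim pos As Long = fs.Position
        Dim count As Integer = fs.Read(buffer, 0, CInt(Math.Min(remaining, CLng(buffer.Length))))
        If count = 0 Then Exit While            'end of file
        For i As Integer = 0 To count - 1
            buffer(i) = Not buffer(i)           'invert each byte in the buffer
        Next
        fs.Seek(pos, SeekOrigin.Begin)          'rewind so the write overwrites what was just read
        fs.Write(buffer, 0, count)
        remaining -= count
    End While
End Using
```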

See also: http://msdn.microsoft.com/query/dev11.query?appId=Dev11IDEF1&l=EN-US&k=k%28System.IO.FileStream.WriteByte%29;k%28TargetFrameworkMoniker-.NETFramework

SSS
  • Quite a valuable solution (hence the upvote). Yet knowing the context of the program, this would result in a rather big adjustment. Chunking it, on the other hand, doesn't require that many changes (although still not to be underestimated). – User999999 Aug 08 '14 at 06:31
  • You should use some buffer and read the file in chunks, because otherwise it will be incredibly slow. For example, you can use the BufferedStream wrapper – Aik Aug 08 '14 at 09:11