I am getting Out of Memory exceptions when retrieving large blob data from SQL Server. I am calling a stored procedure which returns 6 columns of simple data and 1 varbinary(max) data column.
I am using this code to execute the stored procedure:
m_DataReader = cmd.ExecuteReader(CommandBehavior.SequentialAccess);
and am ensuring that I read columns from the data reader in sequential column order.
See the MSDN article on retrieving large data.
For the varbinary(max) column, I am reading the data out like this:
DocBytes = m_DataReader.GetValue(i) as byte[];
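Put together, the read loop looks roughly like this (the simple-column types and ordinals below are illustrative, not my exact schema):

while (m_DataReader.Read())
{
    // With SequentialAccess the columns must be read in increasing ordinal order,
    // so the six simple columns are read before the blob column.
    int id = m_DataReader.GetInt32(0);          // illustrative simple columns
    string name = m_DataReader.GetString(1);
    // ... remaining simple columns (ordinals 2-5) ...

    // The blob is the last column returned by the stored procedure.
    DocBytes = m_DataReader.GetValue(6) as byte[];
}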
What I have noticed is that at the point of Out of Memory, I appear to have two copies of the byte array in memory: one is in the DocBytes array, and the other is in an internal buffer of SqlDataReader.
Why is there a copy? I assumed I would be handed a reference, or is this due to the way SqlDataReader provides the data internally, i.e. does it always return a copy?
Is there a more memory-efficient way of reading the data from the database?
I have looked at the new .NET 4.5 GetStream method, but unfortunately I do not have the ability to pass the stream on - I need the bytes in memory - so I cannot follow the other examples of streaming into a file or a web response. But I want to try and ensure that only one copy exists in memory at a time!
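For what it is worth, the closest I have got to a single in-memory copy on my side is pre-allocating the array myself and filling it with chunked GetBytes calls. This is only a sketch: it assumes the stored procedure also returns the blob's length in an extra column read before the blob (DocLength, docLengthOrdinal and blobOrdinal below are hypothetical names):

// Sketch only: assumes a hypothetical DocLength column gives the blob size up front.
long docLength = m_DataReader.GetInt64(docLengthOrdinal);
DocBytes = new byte[docLength];

long offset = 0;
while (offset < docLength)
{
    // GetBytes under SequentialAccess streams the column in chunks rather than
    // buffering the whole value, so DocBytes is the only full copy I allocate.
    long read = m_DataReader.GetBytes(blobOrdinal, offset, DocBytes,
                                      (int)offset, (int)(docLength - offset));
    if (read == 0)
        break;
    offset += read;
}

Even with that, I am not certain it avoids the internal buffer I am seeing, which is really the heart of my question.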
I have come to the conclusion that this is probably just the way it has to be, and that the duplicate copy is simply a buffer that has not yet been garbage collected. I really don't want to have to muck about with forcing garbage collection, and am hoping someone has some ideas about alternative approaches.