
I use SqlBulkCopy to insert approximately 3.7 million rows, and it throws this error:

Exception of type 'System.OutOfMemoryException' was thrown

Here is my code. The columns are added dynamically; otherwise it is straightforward code.

using (var bulkCopy = new SqlBulkCopy(connection))
{
    connection.Open();

    using (var reader = ObjectReader.Create(inputRecordsToProcess))
    {
        bulkCopy.BatchSize = 100000;
        bulkCopy.BulkCopyTimeout = 0; // 0 = no timeout
        bulkCopy.DestinationTableName = schemaName + "." + tableName;

        // Map source properties to destination columns dynamically
        var classProperties = GetClassPropertiesByAttributes<T>();

        foreach (var property in classProperties)
        {
            bulkCopy.ColumnMappings.Add(property.Item2, property.Item1);
        }

        bulkCopy.WriteToServer(reader);
    }
}

One important point: I was able to insert ~3.6 million rows, but it throws the exception when it goes over that. Do I need to make any changes to the code?

This occurs on all servers (dev, prod, and even local).

superachu
  • Is that the Jackson `ObjectReader` you're using? Does the call stack in the exception give away any info on whether it is related to reading or writing? There are some similar questions already (like this http://stackoverflow.com/q/12819151/1132334), but they are all related to problems with the data source. – Cee McSharpface Jan 23 '17 at 17:54
  • If I had to guess, I would say it could be a bug in the implementation of the reader you are using. Can you provide more details about it in the question as an edit? – Scott Chamberlain Jan 23 '17 at 19:05
  • Thanks all for your comments. I tried to debug in a more focused way (which I didn't do before posting this question), and I'm seeing that the error occurs while converting the object types. I'm trying to convert an object type to a strongly typed DTO, and it throws the memory exception in the foreach loop. I'm trying to figure out a way; will keep you posted. Appreciate all your responses, it gives some kick to my brain to also think on the southern side :) – superachu Jan 23 '17 at 20:06

1 Answer


Thank you all for your response.

I figured out what the problem was. First of all, it did not occur in the SqlBulkCopy call itself. I had assumed it did because, in a previous run, I was able to insert a few thousand fewer records than this time, so I thought SqlBulkCopy had a limit or needed some settings in order to handle a large number of records. When I debugged the code, I found that there is a method call before the database insert that "converts the object type to a strong type": I have a dynamic class object which I need to convert to a strongly typed object, so that method loops over all the records to convert them, and that is where it fails. Still, that same method had worked earlier with over 3 million records.
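As a side note, if a conversion loop like this materializes the whole strongly typed list before handing it to the bulk copy, converting lazily can keep memory usage flat, since FastMember's `ObjectReader` consumes an `IEnumerable` as it goes. A minimal sketch (the `RecordDto` type and `ConvertToDto` helper here are hypothetical stand-ins, not the actual method from the question):

```csharp
// Hypothetical sketch: stream the conversion instead of building a full List<T>.
private static IEnumerable<RecordDto> ConvertAll(IEnumerable<dynamic> source)
{
    foreach (var item in source)
    {
        // yield return hands one DTO at a time to the consumer
        // (e.g. ObjectReader.Create), so the full converted set
        // never has to fit in memory at once.
        yield return ConvertToDto(item);
    }
}
```

This does not remove the 32-bit address-space limit described below, but it lowers the peak memory needed regardless of platform.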

So after googling, I found that the culprit was the "32-bit" platform. The application was set to run as "Any CPU", but "Prefer 32-bit" was checked, which limits the process's address space. When I unchecked it, I was able to convert the records and insert them into the database using SQL Bulk Copy!
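For reference, the same setting can also be changed directly in the project file instead of through the Visual Studio UI; a minimal sketch, assuming a standard .csproj:

```xml
<!-- Inside the relevant PropertyGroup of the .csproj -->
<PropertyGroup>
  <PlatformTarget>AnyCPU</PlatformTarget>
  <!-- false = let the process run as 64-bit on a 64-bit OS,
       lifting the address-space limit of a 32-bit process -->
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
```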

I made the change after going through the answer in this link: https://social.msdn.microsoft.com/Forums/en-US/ace563e2-66fe-4666-9f04-cbfc90ab59bc/system-out-of-memory-exception-due-to-mismanaged-memory-using-threads?forum=csharplanguage

Thank you all for taking the time to read through my post! Appreciate it!

superachu