I am using SqlBulkCopy to insert approximately 3.7 million rows, and it throws this error:
Exception of type 'System.OutOfMemoryException' was thrown
Here is my code. The columns are added dynamically; otherwise the code is straightforward.
using (var bulkCopy = new SqlBulkCopy(connection))
{
    connection.Open();
    using (var reader = ObjectReader.Create(inputRecordsToProcess))
    {
        bulkCopy.BatchSize = 100000;
        bulkCopy.BulkCopyTimeout = 0; // no timeout
        bulkCopy.DestinationTableName = schemaName + "." + tableName;

        // Map each source property to its destination column dynamically.
        var classProperties = GetClassPropertiesByAttributes<T>();
        foreach (var property in classProperties)
        {
            bulkCopy.ColumnMappings.Add(property.Item2, property.Item1);
        }

        // The OutOfMemoryException is thrown here.
        bulkCopy.WriteToServer(reader);
    }
}
One important point: I was able to insert roughly 3.6 million rows successfully, but the exception is thrown once the row count goes above that. Do I need to make any changes to the code?
This occurs on all servers (dev, prod, and even my local machine).
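Would something along these lines help? This is only a sketch of what I am considering, assuming ObjectReader here is FastMember's: it turns on EnableStreaming and feeds the source in slices via Enumerable.Chunk (which needs .NET 6+; the 100,000 slice size is arbitrary). I have not verified that it actually changes the memory profile.

    using FastMember;            // ObjectReader
    using System.Data.SqlClient; // SqlBulkCopy
    using System.Linq;           // Enumerable.Chunk (needs .NET 6+)

    connection.Open();
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.EnableStreaming = true;  // stream rows from the reader instead of caching them all
        bulkCopy.BatchSize = 100000;
        bulkCopy.BulkCopyTimeout = 0;
        bulkCopy.DestinationTableName = schemaName + "." + tableName;

        foreach (var property in GetClassPropertiesByAttributes<T>())
        {
            bulkCopy.ColumnMappings.Add(property.Item2, property.Item1);
        }

        // Feed the rows in slices so only one slice is wrapped in a reader at a time.
        foreach (var chunk in inputRecordsToProcess.Chunk(100000))
        {
            using (var reader = ObjectReader.Create(chunk))
            {
                bulkCopy.WriteToServer(reader);
            }
        }
    }

If inputRecordsToProcess is already a fully materialized list, the chunking mainly bounds what each reader holds, so I suspect producing the source lazily (for example with yield return) may matter more than the slicing itself.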