
Can anybody explain the performance and behavior differences between specifying BatchSize and not specifying it in OracleBulkCopy? Currently I don't specify BatchSize. My batch size may vary from 100 to 200,000 rows. Please suggest the best approach.

try
{
    var columnMapping =
        from table in orgTable.Columns.Cast<DataColumn>()
        select
            new OracleBulkCopyColumnMapping(table.ColumnName, table.ColumnName);

    using (var bulkcopy
        = new OracleBulkCopy(ConnectionString, OracleBulkCopyOptions.Default))
    {
        bulkcopy.DestinationTableName = GoldOrgTable;
        foreach (var mapping in columnMapping)
            bulkcopy.ColumnMappings.Add(mapping);
        bulkcopy.BulkCopyTimeout = TimeOut.Value;
        bulkcopy.WriteToServer(orgTable);
    }
    return true;
}
finally
{
    // A try block needs a catch or finally to compile; disposing here
    // also ensures the source table is released even if the copy fails.
    orgTable.Dispose();
}
Steven Jeuris
user1887523

1 Answer


The optimum BatchSize is typically somewhere around 500 to 5,000 rows.
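As a sketch of what that looks like (BatchSize and BulkCopyTimeout are properties of ODP.NET's OracleBulkCopy; the connection string, table name, and DataTable here are placeholders from the question):

    // Sketch: explicitly batching a large DataTable load with ODP.NET.
    // Without BatchSize, OracleBulkCopy sends all rows in a single batch;
    // with it, rows are sent to the server in chunks of the given size,
    // which bounds memory use and round-trip cost for very large loads.
    using (var bulkcopy
        = new OracleBulkCopy(ConnectionString, OracleBulkCopyOptions.Default))
    {
        bulkcopy.DestinationTableName = GoldOrgTable;
        bulkcopy.BatchSize = 5000;       // upper end of the suggested 500-5,000 range
        bulkcopy.BulkCopyTimeout = 600;  // seconds; tune for your data volume
        bulkcopy.WriteToServer(orgTable);
    }

For a source that varies from 100 to 200,000 rows, a fixed BatchSize in this range means small loads still go in one batch while large loads are split automatically.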

The pros of using bulk copy are:

  • increased insert performance (less time needed to load rows into the database)
  • the ability to handle heavy insert workloads (many rows) on the fly

However, it also has some cons to be aware of:

  • it is harder to troubleshoot when a single row in the bulk copy hits an error
  • you might not notice data type, collation, or conversion issues (Unicode vs. non-Unicode) during insertion or truncation; some data may be inserted incorrectly (dirty data) even though the bulk copy completes successfully
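One common way to mitigate the troubleshooting problem (my own sketch, not part of the answer above; WriteBatchWithFallback is a hypothetical helper) is to retry a failed batch row by row, so the offending row surfaces with a precise error:

    // Sketch (hypothetical helper): when a bulk batch fails, retry the rows
    // one at a time so the single bad row can be identified and logged.
    private static void WriteBatchWithFallback(OracleBulkCopy bulkcopy, DataTable batch)
    {
        try
        {
            bulkcopy.WriteToServer(batch);
        }
        catch (Exception)
        {
            foreach (DataRow row in batch.Rows)
            {
                var single = batch.Clone();   // same schema, no rows
                single.ImportRow(row);
                try { bulkcopy.WriteToServer(single); }
                catch (Exception ex)
                {
                    Console.WriteLine($"Row failed: {ex.Message}");
                }
            }
        }
    }

This is slow in the failure path, but it only runs when a batch has already failed, so the happy path keeps full bulk-copy speed.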

You can find further discussion on the following pages: http://docs.oracle.com/html/E10927_01/OracleBulkCopyClass.htm#CHDCDEIA or Bulk Insert to Oracle using .NET

Community
Watlaoyeh