  • hundreds of CSV files
  • for every CSV I have to save thousands of rows of data into a SQL table
  • I'm using C# .NET, a DataSet, and TableAdapters, as you can read in the pseudo code below
  • performance is poor: roughly 100 rows/second. How can I improve my code?

Pseudo code

var dsMaster = new DS_Master();
var el99TestTableAdapter = new EL_99_TESTTableAdapter();

// For each parsed CSV line, create a typed row and add it to the in-memory table
foreach (var csvLine in csvLines)
{
    var el99TestRow = dsMaster.EL_99_TEST.NewEL_99_TESTRow();
    // ... copy the CSV fields into el99TestRow here ...
    dsMaster.EL_99_TEST.Rows.Add(el99TestRow);
}

// A single Update call then sends all rows to SQL Server, one INSERT per row
el99TestTableAdapter.Adapter.Update(dsMaster.EL_99_TEST);
Massimo Variolo
  • Is SSIS not an option here? If not, building and parameterizing a large block of SQL would execute the fastest that I'm aware of. However, there's a limit on how many parameters you can have in a query (32,000 I think) so you'd still need to batch them if you took that approach. – DiskJunky Oct 10 '17 at 10:21
  • Possible duplicate: https://stackoverflow.com/questions/1050112/how-to-read-a-csv-file-into-a-net-datatable. – jsanalytics Oct 10 '17 at 10:33
  • @jstreet, it's not really a duplicate of that - OP is asking for an efficient way to import the CSV, not *how* to import a CSV – DiskJunky Oct 10 '17 at 10:48
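For reference, the batched, parameterized INSERT that DiskJunky suggests could look roughly like the sketch below. The table EL_99_TEST with two placeholder columns (COL_A, COL_B), the helper name InsertInBatches, and the batch size are assumptions, not part of the question; note that SQL Server accepts at most 2,100 parameters per command, so batchSize times the column count has to stay below that.

// Sketch of a batched, parameterized INSERT (placeholder table/columns; adapt to the real schema)
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using System.Text;

static void InsertInBatches(SqlConnection conn, IList<string[]> rows, int batchSize = 500)
{
    for (int offset = 0; offset < rows.Count; offset += batchSize)
    {
        var batch = rows.Skip(offset).Take(batchSize).ToList();
        var sql = new StringBuilder("INSERT INTO EL_99_TEST (COL_A, COL_B) VALUES ");

        using (var cmd = new SqlCommand { Connection = conn })
        {
            for (int i = 0; i < batch.Count; i++)
            {
                if (i > 0) sql.Append(", ");
                sql.Append($"(@a{i}, @b{i})");
                cmd.Parameters.AddWithValue($"@a{i}", batch[i][0]);
                cmd.Parameters.AddWithValue($"@b{i}", batch[i][1]);
            }
            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery(); // one round trip per batch instead of one per row
        }
    }
}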

1 Answer


This solution tested well: under 5 seconds for 85,000 rows.

// Requires System.Data and System.Data.SqlClient.
// _dbConnection is the database connection (or connection string) field defined elsewhere in the class.
private void BulkSaveCsvData(DataTable dt, string destinationTableName)
{
    using (var bulkCopy = new SqlBulkCopy(_dbConnection, SqlBulkCopyOptions.Default))
    {
        // Map each DataTable column to the destination column with the same name
        foreach (DataColumn col in dt.Columns)
        {
            bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
        }
        bulkCopy.BulkCopyTimeout = 600; // seconds
        bulkCopy.DestinationTableName = destinationTableName;
        bulkCopy.WriteToServer(dt); // single bulk insert instead of row-by-row INSERTs
    }
}
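
On the loading side, a minimal sketch of how one CSV file could be turned into the DataTable passed to BulkSaveCsvData is shown below. It assumes the first line of each file holds the column names and that fields contain no quoted commas (otherwise use a real CSV parser, e.g. the approach from the question linked in the comments); LoadCsvIntoDataTable and the file path are hypothetical, not part of the original answer.

// Hypothetical helper: build a DataTable from a simple comma-separated file
using System.Data;
using System.IO;
using System.Linq;

private DataTable LoadCsvIntoDataTable(string csvPath)
{
    var dt = new DataTable();

    foreach (var line in File.ReadLines(csvPath))
    {
        var fields = line.Split(',');
        if (dt.Columns.Count == 0)
        {
            // First line: create one string column per CSV header
            foreach (var name in fields)
                dt.Columns.Add(name.Trim());
        }
        else
        {
            dt.Rows.Add(fields.Cast<object>().ToArray());
        }
    }
    return dt;
}

// Usage: one bulk copy per CSV file
// var dt = LoadCsvIntoDataTable(@"C:\data\file1.csv");
// BulkSaveCsvData(dt, "EL_99_TEST");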
Massimo Variolo