
Is there a performant way to load very big CSV files (several gigabytes in size) into a SQL Server 2008 database with .NET?

Stecya
Elmex

3 Answers


I would combine a CSV reader that implements IDataReader with SqlBulkCopy; i.e.

using (var file = new StreamReader(path))
using (var csv = new CsvReader(file, true)) // true = has header row
using (var bcp = new SqlBulkCopy(connection)) {
    bcp.DestinationTableName = "TableName";
    bcp.WriteToServer(csv);
}

This uses the bulk-copy API to do the inserts, via a fully managed (and fast) IDataReader implementation that, crucially, streams the data rather than loading it all into memory at once.

Marc Gravell

Look into using the SqlBulkCopy class.
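A minimal sketch of what that might look like. The connection string, table name, and column layout here are assumptions for illustration; note that a DataTable buffers all rows in memory, so for multi-gigabyte files the streaming IDataReader approach in the answer above is preferable:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BulkLoadSketch
{
    static void Main()
    {
        // Hypothetical connection string and schema -- adjust for your environment.
        const string connectionString =
            "Server=.;Database=MyDb;Integrated Security=true";

        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));

        // For a very large file you would load and write in batches
        // rather than buffering everything; this sketch loads one batch.
        foreach (var line in File.ReadLines("data.csv"))
        {
            var fields = line.Split(',');
            table.Rows.Add(int.Parse(fields[0]), fields[1]);
        }

        using (var connection = new SqlConnection(connectionString))
        using (var bcp = new SqlBulkCopy(connection))
        {
            connection.Open();
            bcp.DestinationTableName = "TableName";
            bcp.BatchSize = 10000;   // rows sent to the server per batch
            bcp.BulkCopyTimeout = 0; // disable the timeout for large loads
            bcp.WriteToServer(table);
        }
    }
}
```

Setting BatchSize keeps individual round trips (and the transaction log pressure per batch) bounded, which matters at this data volume.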

Ash Burlaczenko

Use SqlBulkCopy: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx

Matt