
I have a binary file with about 185k rows in it. C# parses the file in seconds. What would be the best way to update an MSSQL table with that data?

What I've tried:

  1. The easiest way: read a binary row, parse it, and run an UPDATE. The whole process takes around 2 days to update all the data.
  2. Combine 200 update queries and send them to MSSQL at once (see the sketch after this list). In this case the update takes 8 to 10 hours.
  3. Combine 500+ queries into a single batch. This works faster still, but throws timeout exceptions from time to time, so some updates don't go through.
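
Roughly what the batching in attempts 2 and 3 looks like (simplified; the table, column, and row-type names are placeholders for my real schema):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

class ParsedRow { public int Id; public string Value; }   // placeholder for the parsed binary row

// Simplified sketch of the batching approach: concatenate N UPDATE statements
// and send them to the server in a single round trip.
static void BatchedUpdate(string connectionString, IEnumerable<ParsedRow> rows)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand { Connection = conn, CommandTimeout = 300 })
    {
        conn.Open();
        var sql = new StringBuilder();
        int n = 0;

        foreach (var row in rows)
        {
            sql.AppendFormat("UPDATE dbo.MyTable SET Value = @v{0} WHERE Id = @id{0};", n);
            cmd.Parameters.AddWithValue("@v" + n, row.Value);
            cmd.Parameters.AddWithValue("@id" + n, row.Id);
            n++;

            if (n == 200)                 // send every 200 statements in one command
            {
                cmd.CommandText = sql.ToString();
                cmd.ExecuteNonQuery();
                sql.Clear();
                cmd.Parameters.Clear();
                n = 0;
            }
        }

        if (n > 0)                        // flush the remainder
        {
            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery();
        }
    }
}
```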

Any advice on how to speed up the update process?

Alex D
  • Are these updates/inserts/deletes? Also, how many indexes do you have on this table? How many total rows are there? Is the table partitioned? – Adriaan Stander Sep 06 '12 at 03:42
  • See [SqlBulkCopy](http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx) for transferring the data. If needed this could be to a [temporary] table and then `MERGE`ed or whatnot: then the update performance primarily becomes a function of the database itself. Also see [this post about updates](http://stackoverflow.com/questions/1700487/using-sqlbulkcopy-to-insert-update-database). –  Sep 06 '12 at 03:47
  • Also, make sure that transactions are being used correctly. There should be *little difference* between running 100 single-record updates and 1 update covering 100 records if they run under one transaction (this isn't entirely true, but it's my starting rule of thumb). –  Sep 06 '12 at 03:55
  • See also (various interesting related questions) http://stackoverflow.com/questions/4889123/any-way-to-sqlbulkcopy-insert-or-update-if-exists , http://stackoverflow.com/questions/4935846/sqlbulkcopy-or-bulk-insert , http://stackoverflow.com/questions/11840522/bulk-data-insertion-in-sql-server-table-from-delimited-text-file-using-c-sharp –  Sep 06 '12 at 04:01

3 Answers


Not sure you really want to do it via C#: you probably want to use BULK INSERT and give it a file with your data properly formatted.
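
A minimal sketch of that approach, assuming the parsed rows can be written out as a delimited file that the SQL Server service account can read (the path, table, and file format below are placeholders):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.IO;
using System.Linq;

// Sketch only: dump the parsed rows to a delimited file, then have the server pull it in.
// The file must be readable by the SQL Server service account, not just by the client app.
static void BulkInsertFromFile(string connectionString, IEnumerable<string[]> rows)
{
    const string path = @"\\fileserver\staging\rows.csv";   // placeholder path
    File.WriteAllLines(path, rows.Select(fields => string.Join(",", fields)));

    const string sql = @"BULK INSERT dbo.MyTable
                         FROM '\\fileserver\staging\rows.csv'
                         WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        conn.Open();
        cmd.CommandTimeout = 600;   // a bulk load of 185k rows can still take a while
        cmd.ExecuteNonQuery();
    }
}
```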

Femi
    [SqlBulkCopy](http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx) is likely the better option when working *from* C#. –  Sep 06 '12 at 03:46
  • 1
    To use bulk insert, you must be able to put the data file on the file system of the SQL Server's PC, and the file must be accessible (Access Rights) by sqlservr.exe. (Not in C:\Users\User) – linquize Sep 06 '12 at 03:59
  • @pst would you repost your comment as an answer so I can accept it? SqlBulkCopy + Merge gave me the best results – Alex D Sep 11 '12 at 22:28

I would try a table-valued parameter:

http://www.codeproject.com/Articles/22392/SQL-Server-2008-Table-Valued-Parameters
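
A rough sketch of what that could look like, assuming a matching user-defined table type already exists on the server (type, table, and column names here are placeholders):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch of a table-valued parameter update. It assumes a table type like
//   CREATE TYPE dbo.RowUpdate AS TABLE (Id INT PRIMARY KEY, Value NVARCHAR(100));
// has already been created on the server.
static void UpdateViaTvp(string connectionString, DataTable rows)
{
    const string sql = @"UPDATE t
                            SET t.Value = u.Value
                           FROM dbo.MyTable AS t
                           JOIN @updates AS u ON u.Id = t.Id;";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        var p = cmd.Parameters.AddWithValue("@updates", rows);
        p.SqlDbType = SqlDbType.Structured;
        p.TypeName = "dbo.RowUpdate";     // must match the server-side table type

        conn.Open();
        cmd.ExecuteNonQuery();            // all rows cross the wire in a single call
    }
}
```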

pkExec

Use SqlBulkCopy (to a temporary table) followed by MERGE.

The bulk-copy method can efficiently transfer data using a "push" from a .NET client. Using BULK INSERT requires a "pull" from the server (along with the required local-file access).

Then the MERGE command (SQL Server 2008+) can insert, update, or upsert the data from the temporary table into the target table according to whatever rules you need. Since the data is entirely inside the database at that point, the operation is about as fast as it can be. Using MERGE can also outperform issuing many individual commands, even when those run inside a single transaction.
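
A sketch of what this could look like from C#; the table, column, and type names are placeholders, and the parsed rows are assumed to be in a DataTable with matching columns:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: SqlBulkCopy into a temp table, then MERGE into the target table server-side.
static void BulkCopyThenMerge(string connectionString, DataTable rows)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // 1. Create a staging table that lives for the duration of this connection.
        using (var create = new SqlCommand(
            "CREATE TABLE #staging (Id INT PRIMARY KEY, Value NVARCHAR(100));", conn))
        {
            create.ExecuteNonQuery();
        }

        // 2. Push all parsed rows to the staging table in one bulk stream.
        using (var bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "#staging";
            bulk.BatchSize = 10000;
            bulk.WriteToServer(rows);
        }

        // 3. Upsert from the staging table into the target table, entirely in the database.
        const string merge = @"MERGE dbo.MyTable AS target
                               USING #staging AS source ON source.Id = target.Id
                               WHEN MATCHED THEN
                                   UPDATE SET target.Value = source.Value
                               WHEN NOT MATCHED THEN
                                   INSERT (Id, Value) VALUES (source.Id, source.Value);";

        using (var cmd = new SqlCommand(merge, conn))
        {
            cmd.CommandTimeout = 600;
            cmd.ExecuteNonQuery();
        }
    }
}
```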

See also the related questions linked in the comments on the question above.
