I'd like to know the fastest way to insert data into a SQL Server database. Currently I'm using SqlCommand with an "INSERT" statement, but it takes a long time because I have to insert a lot of rows (around 500k). Could you suggest the best solution in my case? I have also read that I could use SqlBulkCopy; do you think it would be a good solution?
-
Try doing it with Entity Framework and pass the data using a model: https://msdn.microsoft.com/en-us/library/bb386876(v=vs.110).aspx – Thili77 Mar 04 '17 at 16:16
-
What does your code look like? How much time is "a lot"? – Brandon Kramer Mar 04 '17 at 16:17
-
possible duplicate from http://stackoverflow.com/questions/5940225/fastest-way-of-inserting-in-entity-framework – Xavave Mar 04 '17 at 16:21
-
What is the role played by MySql in your problem? – Steve Mar 04 '17 at 16:26
-
Sorry sirs, I'm new to the forum and I thought it was the correct tag because it was suggested. Sorry again – Antonio Mar 04 '17 at 16:27
-
@Xavave it could be, if we knew that EF was involved here. – Steve Mar 04 '17 at 16:27
-
Are you inserting row by row, or do you batch multiple inserts into one command? – CSharpie Mar 04 '17 at 16:45
3 Answers
If you are talking about routine 'programmatic' data imports, SqlBulkCopy is the way to go in my experience. Otherwise, database admin tricks on the server itself would probably be more suitable.
Microsoft SQL Server includes a popular command-line utility named bcp for quickly bulk copying large files into tables or views in SQL Server databases. The SqlBulkCopy class allows you to write managed code solutions that provide similar functionality. There are other ways to load data into a SQL Server table (INSERT statements, for example) but SqlBulkCopy offers a significant performance advantage over them.
At one place I worked, we used to import and merge about a million records every day. We used a combination of SqlBulkCopy and stored procedures: a bulk copy would import into a 'scratch' table, and stored procedures would do an INSERT INTO ... SELECT or UPDATE ... SELECT (SO has questions with good examples) to merge the data. The whole thing used to take under a minute (maybe two).
With SQL, you can copy information from one table into another. The INSERT INTO SELECT statement copies data from one table and inserts it into an existing table.
I found SqlBulkCopy hard to maintain, but the performance benefit makes it bearable. Also, if you have a separate table (matching the DataTable) to receive the import, it is easier to manage.
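A minimal sketch of the scratch-table approach described above. The table names (`dbo.Customers`, `dbo.Customers_Staging`), column names, and connection string are hypothetical; the staging table is assumed to already exist with the same schema as the destination:

```csharp
using System.Data;
using System.Data.SqlClient;

static void BulkImportAndMerge(DataTable rows, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // 1. Bulk-copy the rows into the 'scratch' (staging) table.
        using (var bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.Customers_Staging";
            bulk.BatchSize = 10000;   // tune for your workload
            bulk.BulkCopyTimeout = 0; // no timeout for large loads
            bulk.WriteToServer(rows);
        }

        // 2. Merge the staged rows into the real table with one
        //    set-based statement, then clear the staging table.
        const string mergeSql = @"
            INSERT INTO dbo.Customers (Id, Name)
            SELECT s.Id, s.Name
            FROM dbo.Customers_Staging AS s
            WHERE NOT EXISTS
                (SELECT 1 FROM dbo.Customers AS c WHERE c.Id = s.Id);

            TRUNCATE TABLE dbo.Customers_Staging;";

        using (var cmd = new SqlCommand(mergeSql, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}
```

If the DataTable columns don't line up with the staging table by position, add explicit `bulk.ColumnMappings.Add(...)` calls before `WriteToServer`.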
-
I tried SqlBulkCopy and it works very well. I have a question: if I add rows through SqlBulkCopy, do I have to select and merge the data with the new rows, or does the WriteToServer() function merge data automatically? – Antonio Mar 04 '17 at 17:00
-
WriteToServer() does not merge data, AFAIK. The solution I suggest is that after the bulk import you run a data-specific INSERT INTO ... SELECT or UPDATE ... SELECT ... WHERE statement, for example to update only the rows that have a different value in a certain field or timestamp. – mtmk Mar 04 '17 at 17:53
-
Just edited the answer with a link to an UPDATE SELECT statement. Just remembered that's what we used to use to merge the data. – mtmk Mar 04 '17 at 17:59
Use bulk copy; it will increase performance significantly.
See here for reference: https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy(v=vs.110).aspx

Use SqlBulkCopy, but before that make sure all non-clustered indexes on the table you are inserting records into are dropped, then re-create them afterwards; it will be much faster in your case.
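The drop-and-recreate step can be scripted from the client around the bulk load. A sketch, assuming a single non-clustered index `IX_Customers_Name` on a `dbo.Customers` table (all names here are illustrative):

```csharp
using System;
using System.Data.SqlClient;

static void ReloadWithIndexDrop(SqlConnection conn, Action bulkLoad)
{
    // Drop the non-clustered index so the bulk insert
    // doesn't have to maintain it row by row.
    using (var drop = new SqlCommand(
        "DROP INDEX IF EXISTS IX_Customers_Name ON dbo.Customers;", conn))
    {
        drop.ExecuteNonQuery();
    }

    bulkLoad(); // e.g. a SqlBulkCopy.WriteToServer(...) call

    // Re-create the index once, after all rows are in.
    using (var create = new SqlCommand(
        "CREATE NONCLUSTERED INDEX IX_Customers_Name " +
        "ON dbo.Customers (Name);", conn))
    {
        create.ExecuteNonQuery();
    }
}
```

Note that `DROP INDEX IF EXISTS` requires SQL Server 2016 or later; on older versions, guard the drop with a check against `sys.indexes`.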
