
Is there a fast, efficient way in VB.NET/ADO.NET to insert a large amount of data from a Generic.List(Of Integer) into a SQL Server table, besides looping through the list and issuing individual INSERT commands? I am limited to .NET 3.5 and SQL Server 2005.

Thanks!

Yuriy Galanter
  • [SqlBulkCopy](http://msdn.microsoft.com/en-us/library/System.Data.SqlClient.SqlBulkCopy.aspx)? (need a datatable). [Table Value Parameters](http://www.codeproject.com/Articles/39161/C-and-Table-Value-Parameters)? – Steve Jun 19 '13 at 20:00
    @Steve - TVPs are 2008+ – Martin Smith Jun 19 '13 at 20:01
  • Wasn't aware of `SqlBulkCopy` class, thanks! Does need datatable or reader, but I think this still will be fastest way. Can't use TVP, don't think SQL Server 2005 support those – Yuriy Galanter Jun 19 '13 at 20:04
  • @MartinSmith SqlBulkCopy seem like a way to go, now my question is what is the fastest way to convert list to datatable/reader? Yes I know there're answers to that on SO as well, but they require external libraries and deal with lists of objects with different properties. In my case it's a list of integers, I was wondering if there was a faster less complicated way. – Yuriy Galanter Jun 19 '13 at 20:31
  • You could use the extension method [`CopyToDataTable`](http://msdn.microsoft.com/en-us/library/bb396189.aspx) on an `IEnumerable` (which is valid for a `List` too). You have to use .NET 3.5 or higher and reference `System.Data.DataSetExtensions.dll` assembly and import the `System.Data` namespace. – Styxxy Jun 19 '13 at 20:40
  • @Styxxy Unfortunately for that *The parameter of the input parameter source can only be of type DataRow or a type derived from DataRow.* – Yuriy Galanter Jun 19 '13 at 20:50
  • Sorry, my bad. I did indeed overlook that, I must be tired to overlook such a straight forward thing (it is even described). – Styxxy Jun 19 '13 at 22:43
  • Why can't you simply create a DataTable yourself and add the records of your list yourself? I don't think you'll find much faster things around. – Styxxy Jun 19 '13 at 22:58
  • I think I will have to. Unfortunately I am stuck with generic list as a source and was hoping to avoid loops altogether. Now at least it will not be DB INSERT inside of the loop – Yuriy Galanter Jun 20 '13 at 00:07
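
A minimal sketch of the `SqlBulkCopy` approach discussed in the comments: build a single-column `DataTable` from the list, then bulk-copy it in one round trip. The table name `dbo.Numbers` and column name `Value` are placeholders, not anything from the question.

```vb
Imports System.Collections.Generic
Imports System.Data
Imports System.Data.SqlClient

Module BulkInsertExample
    Sub BulkInsert(ByVal numbers As List(Of Integer), ByVal connectionString As String)
        ' One loop in memory to fill the DataTable; no per-row INSERT round trips.
        Dim table As New DataTable()
        table.Columns.Add("Value", GetType(Integer))
        For Each n As Integer In numbers
            table.Rows.Add(n)
        Next

        Using bulk As New SqlBulkCopy(connectionString)
            bulk.DestinationTableName = "dbo.Numbers"   ' placeholder target table
            bulk.ColumnMappings.Add("Value", "Value")
            bulk.BatchSize = 10000                      ' tune for your workload
            bulk.WriteToServer(table)
        End Using
    End Sub
End Module
```

Both `SqlBulkCopy` and `DataTable` are available in .NET 3.5 against SQL Server 2005, so this fits the constraints in the question.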

1 Answer


Ship XML with all the changes to a stored procedure.

Here is an older example:

http://granadacoder.wordpress.com/2009/01/27/bulk-insert-example-using-an-idatareader-to-strong-dataset-to-sql-server-xml/

Here is a smaller example that shows the basics:

http://www.mindfiresolutions.com/Sending-Multiple-Records-As-XML-To-SQL-Server-Stored-Procedure-1861.php

Send the XML to a stored procedure. Shred the XML into a @table variable or #temp table. Then do your UPDATEs / INSERTs (or MERGE/UPSERT) against that @table variable or #temp table.

Another example:

http://weblogs.asp.net/dwahlin/archive/2009/09/30/passing-multiple-records-to-a-stored-procedure-in-sql-server.aspx

What I like to do is create a strongly typed dataset, put the data into it, and then send ds.GetXml() to the stored procedure.

That way you get strong typing (from the typed dataset), and you don't have to write your own XML serializer; you piggyback on .GetXml(). Hint: after creating the typed dataset, remove the namespace (tempuri or something like that) so the generated XML is easier to shred.
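
A sketch of this approach, with a plain `DataSet` standing in for a strongly typed one. The procedure name `dbo.ImportNumbersXml` and its `@NumbersXml` parameter are hypothetical; the procedure would shred the XML (e.g. via OPENXML on SQL Server 2005) into a @table variable or #temp table and run its INSERTs there.

```vb
Imports System.Collections.Generic
Imports System.Data
Imports System.Data.SqlClient

Module XmlShipExample
    Sub SendAsXml(ByVal numbers As List(Of Integer), ByVal connectionString As String)
        ' Fill a DataSet; ds.GetXml() produces <NumberImport><Number><Value>…</Value></Number>…
        Dim ds As New DataSet("NumberImport")
        Dim table As DataTable = ds.Tables.Add("Number")
        table.Columns.Add("Value", GetType(Integer))
        For Each n As Integer In numbers
            table.Rows.Add(n)
        Next

        Using conn As New SqlConnection(connectionString)
            Using cmd As New SqlCommand("dbo.ImportNumbersXml", conn)  ' hypothetical proc
                cmd.CommandType = CommandType.StoredProcedure
                cmd.Parameters.AddWithValue("@NumbersXml", ds.GetXml())
                conn.Open()
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module
```

One round trip to the server carries the whole list, and all the shredding and inserting happens server-side inside the procedure.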

granadaCoder
  • Thanks for the detailed possibilities. But the matter still remains of the original source being Generic.List. – Yuriy Galanter Jun 19 '13 at 21:14
  • You convert the List to xml. That's why I mentioned the "I use a strong dataset". That's where I would loop over the List and add rows to a table in the dataset. – granadaCoder Jun 19 '13 at 21:17
  • Seems a bit of an overkill for a bulk insert. If I need to loop thru list to create datatable/dataset I might as well use the datatable in the bulk insert mentioned above. – Yuriy Galanter Jun 19 '13 at 21:30