2

I have to INSERT a lot of rows (more than 1,000,000,000) into a SQL Server database. The table has an auto-increment (identity) Id, two varchar(80) columns and a smalldatetime column with GETDATE() as its default value. The last one is just for auditing, but it is necessary.

I'd like to know the best (fastest) way to INSERT the rows. I've been reading about BULK INSERT, but if possible I'd like to avoid it, because the app does not run on the same server where the database is hosted and I'd like to keep them as isolated as possible.

Thanks!

Diego


3 Answers

4

Another option would be bcp.

Alternatively, if you're using .NET you can use the SqlBulkCopy class to bulk insert data. I've blogged about its performance, comparing SqlBulkCopy against another way of bulk loading data into SQL Server from .NET (using SqlDataAdapter), which you may find interesting. In a basic example, loading 100,000 rows took 0.8229s using SqlBulkCopy vs. 25.0729s using the SqlDataAdapter approach.
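A minimal sketch of the SqlBulkCopy approach, streaming rows in batches so memory stays flat. The connection string, table name and column names here are placeholders — adjust them for your schema (note the identity and GETDATE() default columns are simply omitted so SQL Server fills them in):

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkLoader
{
    static void Main()
    {
        // Placeholder connection string - adjust for your environment.
        const string connStr = "Server=myServer;Database=myDb;Integrated Security=true;";
        const int batchSize = 50000;

        // DataTable matching only the columns we supply; the identity Id
        // and the smalldatetime default are filled in by SQL Server.
        var table = new DataTable();
        table.Columns.Add("ColA", typeof(string));
        table.Columns.Add("ColB", typeof(string));

        using (var bulk = new SqlBulkCopy(connStr))
        {
            bulk.DestinationTableName = "dbo.MyTable"; // placeholder name
            bulk.BatchSize = batchSize;
            bulk.ColumnMappings.Add("ColA", "ColA");
            bulk.ColumnMappings.Add("ColB", "ColB");

            for (int i = 0; i < 1000000; i++)
            {
                table.Rows.Add("valueA" + i, "valueB" + i);

                // Flush and clear periodically so the DataTable never
                // holds more than one batch of rows in memory.
                if (table.Rows.Count == batchSize)
                {
                    bulk.WriteToServer(table);
                    table.Clear();
                }
            }

            if (table.Rows.Count > 0)
                bulk.WriteToServer(table); // final partial batch
        }
    }
}
```

Because SqlBulkCopy talks to the server over a normal connection, the app and database stay on separate machines — no shared files are needed, unlike bcp or BULK INSERT.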

AdaTheDev
  • Thanks for the answer. I'm still reading the links. About bcp: I'd need to either A) create a file on the database server from the app server, or B) access the app server from the database server. That is what I meant by keeping them isolated. I'd like to avoid that, if possible. – Diego Nov 08 '10 at 11:48
  • Right now SqlBulkCopy is the best solution. I've reduced my times by 80%. If no better way appears, this will be the accepted one! – Diego Nov 08 '10 at 12:23
  • "If possible I'd like to avoid it, because the app does not run on the same server where the database is hosted and I'd like to keep them as isolated as possible." It is in the question. – Diego Nov 09 '10 at 11:27
1

Create an SSIS package that copies the file to the SQL Server machine, then use a Data Flow task to import the data from the file into the SQL Server database.

subhash
  • Sorry, I didn't mention it in the question, but an SSIS package is also something we wanted to avoid. – Diego Nov 08 '10 at 13:06
0

There is no faster or more efficient way than BULK INSERT, and when you're dealing with such a large amount of data, do not even think about anything from .NET: thanks to the GC, managing millions of objects in memory causes massive performance degradation.

Pavel Urbančík
  • Right now I'm actually using the SqlBulkCopy class and it is working great! I use a batch size of 50,000 and clear the DataTable after each batch, so I don't actually have a big amount of data in memory. – Diego Nov 08 '10 at 14:41
  • May I ask how long it takes to load 1 billion records? I'm just curious; I've never dealt with such an insanely large amount of data. – Pavel Urbančík Nov 08 '10 at 15:04
  • Right now my best times are: 48s for almost half a million rows. – Diego Nov 09 '10 at 11:28