16

I have an ASP.NET app that takes multi-megabyte file uploads, writes them to disk, and later loads them into MSSQL 2008 with BCP.

I would like to move the whole thing to Azure, but since there are no "files" for BCP, can anyone comment on how to get bulk data from an Azure app into SQL Azure?

I did see http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx but am not sure if that applies.
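
If SqlBulkCopy does apply, something like this rough sketch is what I have in mind, reading the upload straight into a DataTable instead of writing it to disk (the table and column names are just placeholders):

    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    public static class UploadLoader
    {
        // Rough sketch only: read the uploaded stream into a DataTable and push it
        // to SQL Azure with SqlBulkCopy instead of writing a file for BCP.
        public static void LoadCsv(Stream upload, string connectionString)
        {
            var table = new DataTable();
            table.Columns.Add("Col1", typeof(string));
            table.Columns.Add("Col2", typeof(string));

            using (var reader = new StreamReader(upload))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    var fields = line.Split(',');
                    table.Rows.Add(fields[0], fields[1]);
                }
            }

            using (var bulk = new SqlBulkCopy(connectionString))
            {
                bulk.DestinationTableName = "dbo.MyStagingTable";
                bulk.WriteToServer(table);
            }
        }
    }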

Thanks.

Snowy
  • 5,942
  • 19
  • 65
  • 119
  • 1
    If the uploads come from an ASP.NET web page and the data is larger than 30 MB, you will hit the same problems I did; see my question http://stackoverflow.com/questions/6402253/how-to-upload-huge-files-to-the-azure-blog-from-a-web-page for more information. –  Jun 19 '11 at 12:21
  • Please note that this article was written against an earlier architecture behind SQL Azure. Today it does not quite work the same way (in the DTU or vCore models). Bulk copy is a more appropriate way to load data quickly into SQL Azure, since it reduces round trips, groups writes into larger transactions, and takes more extensive locks when inserting, avoiding the overhead of locking each and every row. – Conor Cunningham MSFT Jan 15 '19 at 00:24
  • Hi @Conor Cunningham MSFT: Is there an updated MS article available? I'm currently using SqlBulkCopy exclusively to load data quickly into an Azure SQL Database. – Mitch Wheat Jan 20 '19 at 04:21
  • I will go look through the content and see if we have something more current. I co-authored a white paper a while back on this, but it may also have aged out. Things move more quickly in the cloud ;). Bulk insert is still the fastest way to load into SQL Azure, but there are nuances around batch/transaction size, how close the client is to SQL Azure, and whether you need to bulk insert into a permanent table vs. a temp table (the latter is much faster to insert into in the premium/vCore models, since it does not have to replicate across nodes on commit). – Conor Cunningham MSFT Jan 21 '19 at 00:05

4 Answers

7

BCP is one way to do it.

This post explains it in three easy steps: Bulk insert with Azure SQL
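
For reference, a typical bcp run against SQL Azure looks roughly like this (server, database, table, credentials, and file name below are placeholders):

    bcp dbo.MyTable in upload.dat -c -t "," -b 10000 ^
        -S myserver.database.windows.net -d MyAzureDb ^
        -U myuser@myserver -P mypassword

The -b batch size keeps each committed batch reasonably small, which tends to matter more when loading over a WAN connection to Azure.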

Jeffrey Rosselle
  • 745
  • 1
  • 9
  • 25
  • BCPing an 80 GB file to Azure takes 12 hours. If I bulk insert locally, it takes 2 hours. Are there any tips, like BCPing from a local file within Azure? – MIKE Dec 01 '17 at 23:11
  • 1
    @MIKE have you tried running the command from an Azure VM inside the same datacenter? You could put the 80 GB file inside an Azure File storage folder, which would make it available to the VM. – Jeffrey Rosselle Dec 04 '17 at 10:09
  • Sounds like a clever idea, I'll try this. Do you know if Azure throttles BCP (slows down inserts) on the standard tier vs. the premium tier? While inserting, I noticed it does about ten 1000-row inserts and then a 30-second delay; I'm not sure if it's being throttled by my machine or by Azure (it's not my local computer, it is a server). – MIKE Dec 04 '17 at 18:23
  • @MIKE I do not have any experience with that, so I couldn't tell. You might want to upgrade your Azure SQL database to a stronger tier, since bulk loading wants a high amount of DTU; a lower tier could be slowing it down. – Jeffrey Rosselle Dec 05 '17 at 08:41
  • I moved to a VM in the same datacenter and BCP is very fast now. I have not tested BCP speeds on standard vs. premium yet, but I did run into a problem on standard where tempdb threw an error for being too large while I was removing duplicates from a large table; I had to switch to premium, which didn't complain. – MIKE Dec 08 '17 at 17:06
3

You are on the right track. The Bulk Copy API will work; I am using it, and it's the fastest way to import data because it uses INSERT BULK statements. Not to be confused with the BULK INSERT statement, which is not supported in SQL Azure. In essence, BCP and the SqlBulkCopy API use the same mechanism.
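
For example, a minimal SqlBulkCopy call against a SQL Azure connection string might look like the sketch below (the connection string, reader, and table name are placeholders; the batch size and TableLock hint are just reasonable starting points):

    using System.Data;
    using System.Data.SqlClient;

    public static class AzureBulkLoader
    {
        // Rough sketch: stream rows from any IDataReader straight into SQL Azure.
        // SqlBulkCopy issues INSERT BULK under the covers, just like BCP does.
        public static void Load(IDataReader source, string azureConnectionString)
        {
            using (var bulk = new SqlBulkCopy(azureConnectionString, SqlBulkCopyOptions.TableLock))
            {
                bulk.DestinationTableName = "dbo.TargetTable";
                bulk.BatchSize = 10000;    // rows sent to the server per batch; tune for your workload
                bulk.BulkCopyTimeout = 0;  // 0 = no timeout, useful for multi-megabyte loads
                bulk.WriteToServer(source);
            }
        }
    }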

Herve Roggero
  • 5,149
  • 1
  • 17
  • 11
2

See http://www.solidq.com/sqj/Pages/2011-May-Issue/Migrating-Data-into-Microsofts-Data-Platform-SQL-Azure.aspx for a detailed analysis of the available options.

paras_doshi
  • 1,027
  • 1
  • 12
  • 19
1

I think it's important to note that BCP cannot handle Unicode source files when you use a format file to do your import.
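
If the source file is Unicode and the default column mapping works for your table, one workaround to try is wide-character mode, which skips the format file entirely (server, database, table, and credentials below are placeholders):

    bcp dbo.MyTable in data_unicode.txt -w -S myserver.database.windows.net -d MyAzureDb -U myuser@myserver -P mypassword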

Reza Shirazian
  • 2,303
  • 1
  • 22
  • 30