
I have a huge database with many tables in SQL Server, and I'm migrating it to an Azure SQL database.

The migration methods suggested all over the internet are nice and cute, but since Azure SQL doesn't support some things (OPENXML, sp_addextendedproperty, etc.), I had to rewrite most of my stored procedures and many of my table definitions and views. Because of this, I chose to migrate the schema and the data separately. The schema is already in Azure.

I have many questions about moving the data separately:

1. For migrating data only from SQL Server to an Azure database, is there a better / easier way than bcp? With bcp, I'd have to do it table by table, and I have 40 tables. There must be a better way... no?

2. Should I run bcp on the server, or on my local machine? The latter feels safer. The database is constantly in use by many users, and I don't want to drag down its performance.

3. Won't bcp mess up my collations or data types?

I'd use something like this:

```
bcp MyDatabase.dbo.Table1 out C:\DbExport\Table1.txt -c -S 1.2.3.4 -U sa -P password
```
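To avoid typing 40 of these by hand, here's a sketch of how the table-by-table round trip could be scripted: generating a `bcp ... out` command per table for the export, and a matching `bcp ... in` command targeting the Azure server for the import. All server names, credentials, and table names below are placeholders, not my real setup:

```shell
#!/bin/sh
# Sketch: generate bcp command pairs (export from the source server,
# import into Azure SQL) for a list of tables.
# All server names, user names, and passwords are placeholders.

gen_bcp_cmds() {
  db="$1"; src="$2"; dst="$3"; shift 3
  for t in "$@"; do
    # -c exports in character mode; -N (native mode) preserves data
    # types more faithfully when both sides are SQL Server.
    echo "bcp $db.dbo.$t out C:/DbExport/$t.txt -c -S $src -U sa -P password"
    echo "bcp $db.dbo.$t in  C:/DbExport/$t.txt -c -S $dst -U azureuser -P password"
  done
}

gen_bcp_cmds MyDatabase 1.2.3.4 myserver.database.windows.net Table1 Table2
```

The generated commands could then be reviewed and run as a batch, pausing the import side until all exports have finished.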
vacip
  • I suspect there is some confusion on what bcp/BULK INSERT does – Panagiotis Kanavos Jun 13 '16 at 12:43
  • @PanagiotisKanavos There might be, as this is the first time I'll use those. :) Bcp gets all the data from a table into a text file (export). Bulk insert can be used to import that data into Azure Db. Am I mistaken? – vacip Jun 13 '16 at 12:45
  • I suggest you check a tutorial on SSIS, data import/export etc before proceeding. Transferring data from a local database to Azure SQL is no different than transferring data from one database to another. You have to understand how DB to DB transfers are performed first, then try to move your database to the cloud – Panagiotis Kanavos Jun 13 '16 at 12:49
  • 1
    Have you checked [this article?](https://azure.microsoft.com/en-us/documentation/articles/sql-database-cloud-migrate/) It explains the various technologies that can be used to migrate to Azure SQL. from the Migration Wizard down to individual data uploads with bcp – Panagiotis Kanavos Jun 13 '16 at 12:52
  • @PanagiotisKanavos Thanks, that is exactly the article I got the idea of `bcp` from. About all the other methods: I didn't even consider them, as I seriously doubted they would be able to rewrite my stored procedures to comply with Azure database restrictions. That seemed like a manual job anyway. – vacip Jun 13 '16 at 12:58
  • The article explains migration technologies, not code rewriting. It doesn't explain how to resolve incompatibilities. *After* you resolve them, you can use the technologies in this article – Panagiotis Kanavos Jun 13 '16 at 13:00
  • Well, yeah, but I **can't resolve all conflicts in place, as some technologies that work in Azure don't work in SQL Server**. Hence I can only use the bcp export + bulk import method. They are incompatible **both ways**! See [how to comment in Azure](http://stackoverflow.com/questions/12458292/how-do-you-comment-your-db-schema-objects-in-sql-azure-database-project) and [how to comment in SQL Server](http://stackoverflow.com/questions/4586842/sql-comments-on-create-table-on-sql-server-2008). So I can't rewrite **and** keep the *old* SQL database working with the *new* code, and then import. – vacip Jun 13 '16 at 13:35
  • @PanagiotisKanavos So *After* I resolve, I can't use most of the technologies in the article. Can I? That was my main question. – vacip Jun 13 '16 at 13:36
  • You miss the point. That article doesn't mention automagic ways to transfer the data. It mentions the various technologies that can be used to transfer data to/from one database to another. You can't decide which one to use if you don't know what they do. SO is not the place to ask for a data integration tutorial. Eg. it should be evident that you can't transfer a live database. Either you go off-line and copy it to the cloud, or you have to transfer a snapshot first and use transaction replication to synchronize the databases until you are ready to switch – Panagiotis Kanavos Jun 13 '16 at 13:42
  • Also, BEFORE you try to move the data to Azure, it should be evident that you should test migration to a local database first, so you can eliminate any incompatibilities with ease. Almost all of the technologies apply, at various points. – Panagiotis Kanavos Jun 13 '16 at 13:44
  • One more thing: did you actually read that article? Because I did. (Like a week ago.) It is not about " the various technologies that can be used to transfer data to/from one database to another." It is specifically about full *database* (not just data) *migration* from *SQL Server to Azure*. – vacip Jun 13 '16 at 14:05
  • Not only that, I know and use all of these technologies daily. Yes, I would fix any incompatibilities first. Yes, I would transfer the schema first, then upload a snapshot of the data. Yes, I would use SSDT to create an SSIS package to automate all that. If I wanted no downtime, I'd also use transactional replication. But I'd try to test the migration procedure in a lab first, so I won't lose any data or have unexpected downtime when the actual migration takes place. – Panagiotis Kanavos Jun 13 '16 at 14:15
  • Cool. Exactly what I have done so far and how I'm planning on going on after reading Randy's answer. Meh. – vacip Jun 13 '16 at 14:32

2 Answers


Yes, there is an alternative way of moving data from one SQL Server into Azure: SSDT (SQL Server Data Tools). The toolset can compare the schemas of two databases and compare their data, which is what you want to do. SSDT will then sync your Azure tables with your source SQL Server tables, and you can do multiple tables at once. I use this routinely to move data from SQL Server tables to my Azure database(s).

SSDT is free and comes with Visual Studio. However, you don't need the full Visual Studio if you don't currently have it: you can install the Visual Studio shell, which installs just enough of Visual Studio to run SSDT.

Of course, there are also third-party commercial tools (like Red Gate's) that will do this too.
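If you want something scriptable alongside these tools, SqlPackage (installed with SSDT) can also move a database as a .bacpac. Note that a .bacpac import creates a new database on the target, so it carries schema and data together rather than data only. A sketch, with placeholder server names, credentials, and paths:

```shell
#!/bin/sh
# Sketch: export the on-premises database (schema + data) to a .bacpac,
# then import it into Azure SQL. All names and paths are placeholders;
# assumes SqlPackage from SSDT is on the PATH.

SRC_SERVER="1.2.3.4"
AZURE_SERVER="myserver.database.windows.net"
BACPAC="C:/DbExport/MyDatabase.bacpac"

# Export from the source server into a .bacpac file:
EXPORT_CMD="SqlPackage /Action:Export /SourceServerName:$SRC_SERVER /SourceDatabaseName:MyDatabase /TargetFile:$BACPAC"

# Import the .bacpac into Azure SQL (creates the database there):
IMPORT_CMD="SqlPackage /Action:Import /SourceFile:$BACPAC /TargetServerName:$AZURE_SERVER /TargetDatabaseName:MyDatabase /TargetUser:azureuser /TargetPassword:password"

echo "$EXPORT_CMD"
echo "$IMPORT_CMD"
```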

Randy Minder
  • SSDT works like a charm, with a few exceptions where the character encoding is giving it an aneurysm. In those cases, I use bcp. Thanks again for the suggestion! – vacip Jun 22 '16 at 13:15

You can also use the xSQL Data Compare tools. With this software you can compare the data between two databases with the same schema and then generate synchronization scripts. Personally, I find it very useful for solving data synchronization problems.

Endi Zhupani
  • Thank you. I'll definitely try it the next time (we have another database awaiting migration). For now, SSDT was 85% OK, the rest was done with BCP (SSDT got confused by funny characters in some text fields). – vacip Jul 06 '16 at 14:27