224

Does anyone know how I can copy a SQL Azure database to my development machine? I'd like to stop paying to have a development database in the cloud, but it's the best way to get production data. I copy my production database to a new development database but I'd like to have that same database local.

Any suggestions?

Praveen Kumar Purushothaman
BZink
    This has gotten dramatically easier. See Atom's answer below (Feb 2018) explaining how to use **Tasks=>Deploy Database...** in SSMS. – Jon Crowell Oct 23 '18 at 22:30

24 Answers

145

There are multiple ways to do this:

  1. Using SSIS (SQL Server Integration Services). It only imports the data in your tables. Column properties, constraints, keys, indices, stored procedures, triggers, security settings, users, logons, etc. are not transferred. However, it is a very simple process and can be done just by going through a wizard in SQL Server Management Studio.
  2. Using a combination of SSIS and DB creation scripts. This gets you the data plus all the metadata that SSIS does not transfer. It is also very simple: first transfer the data using SSIS (see instructions below), then create a DB creation script from the SQL Azure database and replay it on your local database.
  3. Finally, you can use the Import/Export service in SQL Azure. This transfers data (with schema objects) to Azure Blob Storage as a BACPAC. You will need an Azure Storage account, and you do this in the Azure web portal. It is as simple as pressing the "Export" button in the Azure web portal when you select the database you want to export. The downside is that it is a manual procedure; I don't know a way to automate this through tools or scripts -- at least the first part, which requires a click on the web page.

The manual procedure for method #1 (using SSIS) is the following:

  • In SQL Server Management Studio (SSMS), create a new empty database on your local SQL instance.
  • Choose Import Data from the context menu (right-click the database -> Tasks -> Import data...).
  • Type in the connection parameters for the source (SQL Azure). Select ".Net Framework Data Provider for SqlServer" as the provider.
  • Choose the existing empty local database as the destination.
  • Follow the wizard -- you will be able to select the tables whose data you want to copy, and you can skip any tables you don't need. E.g. if you keep application logs in the database, you probably don't need them in your backup.

You can automate it by creating an SSIS package and re-executing it any time you like to re-import the data. Note that you can only import with SSIS into a clean DB; you cannot do incremental updates to your local database once you have already done it once.

Method #2 (SSIS data plus schema objects) is very simple. First go through the steps described above, then create a DB creation script (right-click on the database in SSMS, choose Generate Scripts -> Database Create). Then replay this script on your local database.

Method #3 is described in the blog post here: http://dacguy.wordpress.com/2012/01/24/sql-azure-importexport-service-has-hit-production/. There is a video clip showing the process of transferring DB contents to Azure Blob Storage as a BACPAC. After that you can copy the file locally and import it into your SQL instance. The process of importing a BACPAC into a data-tier application is described here: http://msdn.microsoft.com/en-us/library/hh710052.aspx.
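As a later note on method #3: the export step can now be scripted with the Azure CLI, so the portal click is no longer strictly required. A minimal sketch, assuming the az tool is installed and logged in; every resource name, user, and key below is a hypothetical placeholder:

```
# Export the database as a BACPAC straight into blob storage
# (all names/keys are placeholders -- substitute your own)
az sql db export \
  --resource-group MyResourceGroup \
  --server my-sql-server \
  --name MyDatabase \
  --admin-user serveradmin \
  --admin-password '<password>' \
  --storage-key-type StorageAccessKey \
  --storage-key '<storage-account-key>' \
  --storage-uri https://mystorage.blob.core.windows.net/backups/MyDatabase.bacpac
```

Once the BACPAC lands in blob storage, the import side proceeds exactly as described above.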

Rob
seva titov
    This works, just one correction. In the Server Import/Export wizard the Data source is '.Net Framework Data Provider for SqlServer' – BZink Mar 31 '11 at 01:10
  • Great! Thanks for the correction. I remember one of the providers worked and some others did not work for me, but I did not recall exactly which one. – seva titov Mar 31 '11 at 01:42
  • Fails for me: 'Failure inserting into the read-only column "id"' – dumbledad Jun 26 '12 at 15:25
  • This has one major problem in that “Import Data” really only does what it says on the tin – it imports the data. It only creates as much of the table structure as necessary and doesn't include any indexes, constraints, etc. – Joey Sep 27 '12 at 10:19
  • @l.poellabauer, did you have your destination database empty before launching the wizard? Means no tables defined? – seva titov Oct 22 '12 at 11:16
  • @SevaTitov yes, I dropped every table. In SQL Azure there are Identity columns, but in my local DB these columns are not automatically set to Identity. The data is here, but the identity-type is not set. – l.poellabauer Oct 22 '12 at 15:08
  • @l.poellabauer, Yes, this is how SSIS works. It only imports data. All column properties, constraints, keys, indices, stored procedures, triggers, security settings, users, logons, etc. are not transferred. One way to deal with this is to generate DB creation script and then re-play this script on imported data. – seva titov Oct 22 '12 at 15:36
  • This will only copy *some* schema. So no Store Procedures will be copied, for example. No Foreign Keys. No Indexes. Just data and part of the table schema will be copied. This is not an acceptable answer to me. – Josh Mouch May 15 '13 at 10:41
  • Yup, I followed suggestion # 1, and it copied the tables and data, but no Primary Keys, Foreign Keys, etc. I ended up trying the SQL Azure tool on CodePlex instead. – Mike Gledhill Mar 25 '14 at 15:11
  • This is how it failed for me: I had DateTimeOffsets in my schema, which are converted to sql variants, and the time is *rounded to minutes*. God, don't you just love databases. They all suck. – John May 24 '14 at 22:12
  • The trick for me was to start replicating PKs/FKs/constraints on empty DB, then temporarily disable constraints while importing data. More precisely: 1-Create empty target DB manually. 2-Right-click source DB > Tasks > Generate Scripts. 3-Run script file on empty target DB (now DB has correct PKs/FKs/constraints, but no data). 4-Disable all constraints (http://stackoverflow.com/a/161410). 5-Import data (Right-click target DB > Tasks > Import Data). 6-Re-enable constraints. Hope this helps! – Mathieu Frenette May 26 '14 at 22:22
  • This answer is out of date. I found a better answer: use Import Data-tier Application – Weimin Ye Feb 08 '15 at 04:03
  • @WeiminYe, there is a difference between coping DB to another DB and restoring DB from existing backup. You are talking about DB restore operation from BACPAC, which is a final step in method #3, already described in the answer. – seva titov Feb 08 '15 at 18:20
  • What ever happened to the good old back up and restore method? – Zapnologica Mar 24 '16 at 07:31
  • Following Method#2 when I replay the script on the local DB created from SSIS, it says "There is already object named 'Product' in the database." This error makes sense because Method#1 already created the table "Product", and Method#2 is executing scripts containing re-creation of the same table. How do we get around this? – Kagawa May 24 '16 at 04:38
  • @Kagawa, When generating DB script from SSMS, use Advanced Scripting Options dialog and set property `Check for object existence` to `True` – seva titov May 24 '16 at 15:42
  • @sevatitov There's no `Check for object existence` option in Advanced Scripting Options dialog in 2008 R2. Do you know if there's any equivalent setting for 2008 R2? – Kagawa May 25 '16 at 00:06
  • @Kagawa, you might want to check if you have a latest version of SSMS installed. The latest version at this moment is 2016 April preview: https://msdn.microsoft.com/en-us/library/mt238290.aspx – seva titov May 25 '16 at 15:08
  • How can you export the whole thing if you don't own a storage account? – Jo Smo Jun 15 '16 at 21:29
  • @JoSmo, methods #1 and #2 do not require a storage account. However, I would recommend having a storage account in the same datacenter as your SQL Azure database. The reason is you can use it for DB backups and restores. If you don't have administrative permissions on your Azure subscription, find someone in your organization who has enough rights to create new storage accounts and give you the access keys. Since you already have access to the database, there is no reason to deny you access to a storage account. – seva titov Jun 16 '16 at 04:53
  • can you tell how or provide a reference on how to use azure storage account and generate `BACPAC` file because the export operation was performed using a DB instance that was changing while the export was on-going. And i am getting [these](http://stackoverflow.com/questions/36065995/cannot-import-sql-azure-bacpac-to-2016-ctp) errors while importing ? or can i use [this](https://kvaes.wordpress.com/2015/09/04/azure-direct-sql-server-to-storage-account-backup/) solution to backup to local ? – Shaiju T Feb 13 '17 at 10:36
139

Copy Azure database data to a local database: you can now use SQL Server Management Studio to do this, as below:

  • Connect to the SQL Azure database.
  • Right click the database in Object Explorer.
  • Choose the option "Tasks" / "Deploy Database to SQL Azure".
  • In the step named "Deployment Settings", connect local SQL Server and create New database.


"Next" / "Next" / "Finish"

m00am
Atom
    Awesome this is not so clear in SSMS since the "deploy database to SQL AZURE" is somewhat misleading... – EeKay May 17 '18 at 07:45
  • By far the easiest solution mixed with by far the most misleading menu name. Thank you for posting this. – Kevin Giszewski Jan 25 '19 at 15:10
  • This is simplest, but the drawback is that you cannot pick which tables you want to back up and which you don't. In my company, we have an Attachment table that contains gigabytes of attachments (pictures, etc.). Normally we don't want this, as it would take forever to copy. We can't exclude this table using this method. – Rosdi Kasim May 04 '20 at 09:56
53

In SQL Server 2016 Management Studio, the process for getting an Azure database onto your local machine has been streamlined.

Right click on the database you want to import, click Tasks > Export data-tier application, and export your database to a local .dacpac file.

In your local target SQL server instance, you can right click Databases > Import data-tier application, and once it's local, you can do things like backup and restore the database.

Dave Friedel
Adam Skinner
37

I just wanted to add a simplified version of dumbledad's answer, since it is the correct one.

  1. Export the Azure SQL Database to a BACPAC file on blob storage.
  2. From inside SQL Management studio, right-click your database, click "import data-tier application".
  3. You'll be prompted to enter the information to get to the BACPAC file on your Azure blob storage.
  4. Hit next a couple times and... Done!
Josh Mouch
22

I think it is a lot easier now.

  1. Launch SQL Management Studio
  2. Right Click on "Databases" and select "Import Data-tier application..."
  3. The wizard will take you through the process of connecting to your Azure account, creating a BACPAC file and creating your database.

Additionally, I use SQL Backup and FTP (https://sqlbackupandftp.com/) to do daily backups to a secure FTP server. I simply pull a recent BACPAC file from there and import it via the same dialog, which is a faster and easier way to create a local database.

ThisGuy
9

You can also check out SQL Azure Data Sync in the Windows Azure Management Portal. It allows you to retrieve and restore an entire database, including schema and data, between SQL Azure and SQL Server.

Josh
Voclare
    SQL Data Sync should not be used as part of your backup strategy as there are several limitations. It does not version, it only backs up data and no other objects. For more information, see the SQL Data Sync FAQ topic. (http://msdn.microsoft.com/en-us/library/windowsazure/jj650016.aspx) – Shaun Luttin Jul 26 '13 at 14:52
  • Data Sync is pretty awful - even today (several years after this post) there are a ton of bugs and it can really screw up your db - it also still has a lot of limitations. – William Jun 26 '18 at 20:26
7

Using msdeploy.exe

Caveat: msdeploy.exe fails to create the destination database on its own, so you need to create it manually first.

  1. Copy the connection string from the database properties page. Adjust it so that it contains the correct password.
  2. Get the connection string for the destination DB.
  3. Run msdeploy.exe like this:

    "c:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -dest:dbDacFx="destination_DB_connection_string",dropDestinationDatabase=true -source:dbDacFx="azure_DB_connection_string",includeData=true -verbose
    

Using SqlPackage.exe

  1. Export the Azure DB to a BACPAC package.

    "c:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe" /a:Export /ssn:"azure_db_server" /sdn:"azure_db_name" /su:"user_name" /sp:"password" /tf:"file.bacpac"
    
  2. Import the package to a local DB.

    "c:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\SqlPackage.exe" /a:Import /SourceFile:"file.bacpac" /TargetServerName:".\SQLEXPRESS" /TargetDatabaseName:CopyOfAzureDb
    
Ian Kemp
Gebb
5

It's pretty easy. This worked for me for getting an Azure SQL database down onto your local machine:

  1. Open your SQL Management Studio and connect to your Azure SQL Server.
  2. Select the database you would like to get down onto your local machine, right-click, and select "Generate Scripts". Follow the prompts...

But be careful: if you ALSO want the DATA, as well as the scripts, be sure to check the Advanced Options before beginning the generation. Scroll down to "Types of data to script" and make sure you have "Schema and data"... or whatever is appropriate for you.

It will give you a nice SQL script file which can then be run on your local machine, and it will create the database as well as populate it with all the data.

Bear in mind that in my case I have no FK or other constraints; also, it wasn't a lot of data.

I don't recommend this as a backup mechanism in general...
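If the generated script file is too large to open in SSMS, it can be run from the command line with sqlcmd instead. A sketch, where the server, database, and file names are placeholders:

```
REM -S = server, -d = target database, -E = Windows authentication, -i = input script
sqlcmd -S .\SQLEXPRESS -d MyLocalDb -E -i GeneratedScript.sql
```

This streams the script to the server without ever loading it into an editor.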

  • I have to say this worked remarkably well with no fuss; foreign keys recreated perfectly. The resulting SQL script was too large to open in SSMS, or indeed in many text editors, but I was able to use the command line `sqlcmd /S /d -E -i ` – perlyking Jul 05 '17 at 07:52
5

In SQL Server Management Studio

Right click on the database you want to import, click Tasks > Export data-tier application, and export your database to a local .dacpac file.

In your local target SQL server instance, you can right click Databases > Import data-tier application, and once it's local, you can do things like backup and restore the database.

Ali Sufyan
2

I couldn't get the SSIS import / export to work as I got the error 'Failure inserting into the read-only column "id"'. Nor could I get http://sqlazuremw.codeplex.com/ to work, and the links above to SQL Azure Data Sync didn't work for me.

But I found an excellent blog post about BACPAC files: http://dacguy.wordpress.com/2012/01/24/sql-azure-importexport-service-has-hit-production/

In the video in the post the blog post's author runs through six steps:

  1. Make or go to a storage account in the Azure Management Portal. You'll need the Blob URL and the Primary access key of the storage account.

  2. The blog post advises making a new container for the bacpac file and suggests using the Azure Storage Explorer for that. (N.B. you'll need the Blob URL and the Primary access key of the storage account to add it to the Azure Storage Explorer.)

  3. In the Azure Management Portal select the database you want to export and click 'Export' in the Import and Export section of the ribbon.

  4. The resulting dialogue requires your username and password for the database, the blob URL, and the access key. Don't forget to include the container in the blob URL and to include a filename (e.g. https://testazurestorage.blob.core.windows.net/dbbackups/mytable.bacpac).

  5. After you click Finish the database will be exported to the BACPAC file. This can take a while. You may see a zero byte file show up immediately if you check in the Azure Storage Explorer. This is the Import / Export Service checking that it has write access to the blob-store.

  6. Once that is done you can use the Azure Storage Explorer to download the BACPAC file, and then in SQL Server Management Studio right-click your local server's database folder and choose Import Data-tier Application; this starts the wizard, which reads in the BACPAC file to produce the copy of your Azure database. The wizard can also connect directly to the blob store to obtain the BACPAC file if you would rather not copy it locally first.

The last step may only be available in the SQL Server 2012 edition of SQL Server Management Studio (that's the version I am running); I do not have earlier ones on this machine to check. In the blog post the author uses the command-line tool DacImportExportCli.exe for the import, which I believe is available at http://sqldacexamples.codeplex.com/releases

dumbledad
  • While I didn't follow this exactly, it led me in the right direction. You can skip the download of the bacpac file and point to the Azure Blob storage directly from inside SQL Server Management Studio. – Josh Mouch May 15 '13 at 10:53
2

Regarding "I couldn't get the SSIS import/export to work as I got the error 'Failure inserting into the read-only column "id"'": this can be worked around by specifying, on the mapping screen, that you do want to allow identity elements to be inserted.

After that, everything worked fine using the SQL Import/Export wizard to copy from Azure to a local database.

I only had the SQL Import/Export Wizard that comes with SQL Server 2008 R2 (it worked fine), and Visual Studio 2012 Express to create the local database.

rsh
2

The accepted answer is out of date. I found a better answer: Use Import Data-tier Application

For more detailed information, see this article: Restoring Azure SQL Database to a Local Server

Weimin Ye
2

You can try the tool "SQL Database Migration Wizard". This tool provides options to import and export data from Azure SQL.

Please check more details here.

https://sqlazuremw.codeplex.com/

Krishna_K_Batham
2

I always use the Import/Export feature, which seems to be the easiest of all.

Step 1: Get the backup from the Azure instance as follows: select the database → right-click → Tasks → Export Data-tier Application.

Step 2: Give a specific name for the backup file and save it in your desired location.

Step 3: That's it; you have taken a backup of the database from the SQL instance to your local machine. Let's restore it locally. Copy the backed-up database to your C drive. Now open PowerShell with administrator rights and navigate to the C drive.

Step 4: Download the PowerShell script RemoveMasterKey.ps1 to remove the master key, and keep the script on the same drive, in this case C.

Step 5: Run the script as follows: .\RemoveMasterKey.ps1 -bacpacPath "C:\identity.bacpac"

That's it, now you can restore it on MSSQL 2017 in your local environment.

Step 6: Connect to your local server, and click Databases → Import Data-tier Application.

Step 7: Give a name for the database to restore.

Now you will see everything in green!

Read my blog with diagrams.

Sajeetharan
2

Using SSMS v18.9+, you can use Deploy Database to Microsoft Azure SQL Database.

It's not really intuitive, but the wizard allows you to select a local DB even though the name of the task is Deploy Database to Microsoft Azure SQL Database.

  1. Connect to your Azure database using SSMS.
  2. Right-click on the database, select Tasks > Deploy Database to Microsoft Azure SQL Database.
  3. Select your local SQL server as the target connection.
  4. Follow the additional steps.
Thomas
2

Looks like the functionality is missing from Management Studio in 2022. Here is how I do it using Azure Data Studio:

  1. Install Azure Data Studio.
  2. Open Azure Data Studio.
  3. Install the extension called Admin Pack for SQL Server.
  4. Restart Azure Data Studio.
  5. Set up connections for both the source database (Azure) and the destination (local machine).
  6. Execute this statement in your local DB engine on the master database:

    sp_configure 'contained database authentication', 1;
    GO
    RECONFIGURE;
    GO

  7. Right-click on the source database, select Data-tier Application Wizard, and select Export to .bacpac file [Export bacpac] (4th and last option in my current version).
  8. Finish the wizard.
  9. Right-click on the destination database and select Data-tier Application Wizard.
  10. Select Import from .bacpac [Import bacpac] (3rd option in my version).
  11. Select the .bacpac file previously created and finish the wizard.

After importing, the logins and users are created, but their default schema isn't set. If a default schema is required, that needs to be handled manually. In that case, open an admin connection to the local copy and run:

    USE [imported-db];
    ALTER USER imported-user WITH DEFAULT_SCHEMA = whatever;

Hari
  • I had to put the commands `sp_configure 'contained database authentication', 1; GO; RECONFIGURE; GO;` on separate lines – DLT Sep 27 '22 at 23:26
1

You can use the new Azure Mobile Services to do a nightly backup export from SQL Azure to a .bacpac file hosted in Azure Storage. This solution is 100% cloud, doesn't require a 3rd-party tool, and doesn't require a locally hosted SQL Server instance to download/copy/back up anything.

There's about 8 different steps, but they're all easy: http://geekswithblogs.net/BenBarreth/archive/2013/04/15/how-to-create-a-nightly-backup-of-your-sql-azure.aspx

Ben Barreth
0

The trick for me was to start replicating PKs/FKs/constraints on empty DB, then temporarily disable constraints while importing data (see https://stackoverflow.com/a/161410).

More precisely:

  1. Create empty target DB manually;
  2. Right-click source DB > Tasks > Generate Scripts;
  3. Run script file on empty target DB (now DB has correct PKs/FKs/constraints, but no data);
  4. Disable all constraints;
  5. Import data (Right-click target DB > Tasks > Import Data);
  6. Re-enable constraints.

Hope this helps!
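Steps 4 and 6 can each be done in a single statement using the undocumented but widely used sp_MSforeachtable procedure, which is essentially what the linked answer suggests. A sketch:

```sql
-- Step 4: disable all foreign-key and check constraints on every table
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT all';

-- (Step 5: run the Import Data wizard here)

-- Step 6: re-enable and re-validate all constraints
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT all';
```

The WITH CHECK in the re-enable statement makes SQL Server verify the imported data against the constraints, so any rows that violate an FK surface immediately.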

0

Now you can use the SQL Server Management Studio to do this.

  • Connect to the SQL Azure database.
  • Right click the database in Object Explorer.
  • Choose the option "Tasks" / "Deploy Database to SQL Azure".
  • In the step named "Deployment Settings", select your local database connection.
  • "Next" / "Next" / "Finish"...
ferhrosa
  • I couldn't do this because that "Next" button is greyed out until I select another Azure database as the target connection – Colin Jun 15 '17 at 11:05
0

Use the Import/Export service in SQL Azure to create a .bacpac file.

Then take a look at this method in another Stack Overflow article.

Azure SQL Database Bacpac Local Restore

0

If anyone has a problem importing a BACPAC of a DB that uses Azure SQL Data Sync, Sandrino Di Mattia developed a great, simple application to solve this.

  1. Export a BACPAC of your DB.
  2. Download Di Mattia's binary.
  3. Repair the downloaded BACPAC with this console app.
  4. Launch SSMS.
  5. Right-click on "Databases" and select "Import Data-tier Application".
  6. Select the repaired BACPAC.
Géza
0

If anyone wants a free and effective option (and doesn't mind doing it manually) to back up a database locally, use the schema and data comparison functionality built into Microsoft Visual Studio 2015 Community Edition (free) or the Professional / Premium / Ultimate editions. It works like a charm!

I have a BizSpark account with Azure, and there is no way to back up a database directly without paying. I found that this option in VS works.

Answer is taken from https://stackoverflow.com/a/685073/6796187

Crennotech
0

I'm using the SQLAzureMW tool for SQL Azure DB migration and management. It's a very useful one. It used to be downloaded from CodePlex, but it is no longer available there because CodePlex is shutting down; the same tool is now available on GitHub. The link below explains how to use the tool, and the application is also available there for download.

https://github.com/twright-msft/azure-content/blob/master/articles/sql-database/sql-database-migration-wizard.md

0

Download Optillect SQL Azure Backup - it has a 15-day trial, so it will be enough to move your database :)

Optillect Team
  • This is a link to an internal tool. Assume OP meant to link to the now-retired Red Gate Tool (http://www.red-gate.com/products/dba/sql-azure-backup/) or this project on CodePlex (http://sqlazurebackup.codeplex.com/) – Josh Jan 10 '17 at 23:25