
I have a small (few hundred MB) SQL Server database running on RDS. I've spent several hours trying to get a copy of it onto my local SQL Server 2014 instance. All of the following fail. Any ideas what might work?

  1. Tasks -> Back Up fails because it doesn't give my admin account permission to back up to a local drive.

  2. Copy Database fails during package creation with: "While trying to find a folder on SQL an OLE DB error was encountered with error code 0x80040E4D."

  3. From SSMS, while connected to the RDS server, running BACKUP DATABASE fails with: "BACKUP DATABASE permission denied in database 'MyDB'." This happens even after running EXEC sp_addrolemember 'db_backupoperator' for the connected user.

  4. Generate Scripts produces a 700 MB .sql file. Running it with sqlcmd -i fails at some point after producing plausible .mdf and .ldf files that can't be mounted on the local server (probably because sqlcmd failed to complete and release its locks).

feetwet
  • Create the databases locally manually first and get that working. Then use Tasks / Generate Scripts. In Advanced, select *Types of data to script = Schema and Data* and run the resultant script. It would've helped if you'd posted the error messages originally. If this doesn't work, post the error – Nick.Mc Dec 15 '15 at 03:33
  • @Nick.McDermaid - I was able to eventually get that (#4) working in this case (following a series of reboots of the machine and services to get the MDF to attach), but it's not scalable. – feetwet Dec 15 '15 at 19:08
  • Do you really mean "can't be automated"? If you were to post the error perhaps we could help. – Nick.Mc Dec 16 '15 at 01:00
  • @Nick.McDermaid - The OS thinks they're locked by an MS SQL process, and the MS SQL Server thinks it doesn't have or can't get the lock. I could spend time determining a minimal sequence to get them mounted, but for any significant amount of data this seems to be the most kludgy approach. – feetwet Dec 16 '15 at 01:26
  • If you post the code with the problem and the error maybe we can help. Normally you just issue a CREATE DATABASE and then USE it, but you still have to nominate a path anyway so the way I see it, there is some degree of config required – Nick.Mc Dec 16 '15 at 02:59
  • Have you solved the problem? – André Bonna Feb 26 '16 at 14:39
  • @AndréBonna - Sorry, haven't had a chance to try your solution yet. Will update when I do! – feetwet Feb 26 '16 at 15:44

4 Answers


AWS has finally provided a reasonably easy way to do this: native backup and restore via an S3 bucket.

After creating a bucket called rds-bak, I ran the following stored procedure on the RDS instance:

exec msdb.dbo.rds_backup_database
   @source_db_name='MyDatabase',
   @s3_arn_to_backup_to='arn:aws:s3:::rds-bak/MyDatabase.bak',
   @overwrite_S3_backup_file=1;

The following stored procedure returns the status of the backup request:

exec msdb.dbo.rds_task_status @db_name='MyDatabase'

Once it finished, I downloaded the .bak file from S3 and restored it into a local SQL Server instance using the SSMS Restore Database... wizard!
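For reference, the local restore can also be scripted instead of using the wizard. A sketch, assuming the .bak was downloaded to C:\Temp and the target data directory exists (the logical file names and paths below are assumptions; use whatever FILELISTONLY actually reports):

```sql
-- Inspect the logical file names contained in the backup first
RESTORE FILELISTONLY FROM DISK = N'C:\Temp\MyDatabase.bak';

-- Then restore, relocating the files to local paths.
-- Logical names 'MyDatabase' and 'MyDatabase_log' are assumptions --
-- substitute the names reported by FILELISTONLY above.
RESTORE DATABASE MyDatabase
FROM DISK = N'C:\Temp\MyDatabase.bak'
WITH MOVE N'MyDatabase'     TO N'C:\Data\MyDatabase.mdf',
     MOVE N'MyDatabase_log' TO N'C:\Data\MyDatabase_log.ldf',
     REPLACE;
```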

marc_s
feetwet

The SSIS Import Export Wizard can generate a package to duplicate a whole set of tables. (It's not the sort of Copy Database function that relies on files - it makes a package with data flow components for each table.)

It's somewhat brittle but can be made to work :-)

The SSMS Generate Scripts feature often fails with large data sets because the script for all the data is simply too large/verbose. The SSIS method never scripts out the data.

onupdatecascade
  • I was never able to get SSIS to work against RDS. However, I did manage to get Generate Scripts to work: In `Advanced` settings change `Types of data to script` from `Schema only` to `Schema and data`. Yes, that did create a (now) 1.2GB plain-text file for my 200MB database. But I was able to get it loaded to a local server without too many warnings in 15 minutes using `sqlcmd -i scripted.sql -d MyDatabase -x`. – feetwet Jan 26 '17 at 19:27

Check this out: https://github.com/andrebonna/RDSDump

It is a C#.NET console application that searches for the latest snapshot of the origin database, restores it on a temporary RDS instance, generates a BACPAC file, uploads it to S3, and deletes the temporary RDS instance.

You can transform your RDS snapshot into a BACPAC file, which can then be downloaded and imported into your local SQL Server 2014 instance using the approach described here: (Azure SQL Database Bacpac Local Restore)

André Bonna

Redgate's SQL Compare and SQL Data Compare are invaluable for these kinds of tasks. They are not cheap (but worth every penny, IMO). If this is a one-time thing, you could use the 14-day trial and see how it works for you.

http://www.red-gate.com/products/

E.J. Brennan