
We have a database created on SQL Server 2017 (Express edition) on one of our servers, and we are trying to move it to another server which has SQL Server 2014 (licensed version) installed. We have tried restoring a backup, detaching/attaching the database files, and generating scripts and running them on the new server.

But unfortunately we are not able to restore the database. For backup/restore, we get the following error message.

(Screenshot of the restore error stating that a downgrade path is not supported.)

For generating scripts and running them on the new server, the problem is that the script file is around 3.88 GB. We are not able to edit this file before executing it because it is too large. We have also tried generating the scripts first without data and then with data, but the file with only data still comes to about 3.88 GB (there is very little difference when only data is selected).

What are the options we have?

WAQ
  • Downgrading is much harder than upgrading, why are you downgrading to an unsupported version at all? – HoneyBadger Aug 05 '20 at 13:10
  • Surely the create scripts are not 3.88GB? – HoneyBadger Aug 05 '20 at 13:11
  • Like the error message says "A downgrade path is not supported". The easiest approach would be to upgrade the 2012 instance to version 2017 or later. Alternatives would be: (1) script out the data in smaller files or (2) experiment with [data import/export wizard](https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/import-and-export-data-with-the-sql-server-import-and-export-wizard) in combination with [linked servers](https://learn.microsoft.com/en-us/sql/relational-databases/linked-servers/linked-servers-database-engine). – Sander Aug 05 '20 at 13:14
  • I can't believe that the `CREATE` scripts are close to 4GB in size either. That would mean the definitions of your objects, plus configuration settings, is almost 2 **billion** characters in size. – Thom A Aug 05 '20 at 13:23
  • If your database definition truly is that big, and your data really that small, I would suggest you have a huge amount of denormalisation issues. – Thom A Aug 05 '20 at 13:24
  • The file size with only the schema is merely a few MBs, so I can easily run it and create the database on the target system. But the real problem is moving the data, because we have a lot of information in the database. – WAQ Aug 05 '20 at 13:31
  • @WAQ, so the data is 3.88GB, that makes more sense. What is the difficulty in loading that into the new (2012) instance? It isn't that much, really. – HoneyBadger Aug 05 '20 at 13:41
  • @HoneyBadger As soon as I double-click the .sql file to load it in SQL Server Management Studio, I get a `System.OutOfMemoryException`. – WAQ Aug 05 '20 at 13:45
  • SSMS is a 32bit application, you'll never be able to open a >2GB file in it. You'd be better off using `sqlcmd` or something to run the batch(es). – Thom A Aug 05 '20 at 13:50
  • Either use something else to execute those files as Larnu suggests, or extract the data to flat file and bulk insert it. – HoneyBadger Aug 05 '20 at 14:00
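
Building on the `sqlcmd` suggestion in the comments above: a 3.88 GB script does not need to be opened in an editor at all; it can be executed directly from the command line. A minimal sketch, assuming Windows authentication; the server name, database name and file paths are placeholders:

```
rem Run the generated data script against the target instance without opening it in SSMS.
rem ".\SQL2014", "TargetDb" and the file paths are placeholders.
sqlcmd -S .\SQL2014 -d TargetDb -E -i "C:\scripts\data.sql" -o "C:\scripts\data.log"
```

Alternatively, following the flat-file suggestion, each table can be exported with `bcp` from the 2017 Express instance and loaded into the target instance once the (small) schema-only script has been run there. Again a sketch with placeholder names:

```
rem Export one table to a character-format flat file from the source instance.
bcp SourceDb.dbo.MyTable out "C:\export\MyTable.dat" -S .\SQLEXPRESS -T -c

rem Load the same file into the target instance (the table must already exist there).
bcp TargetDb.dbo.MyTable in "C:\export\MyTable.dat" -S .\SQL2014 -T -c
```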

1 Answer


Since you probably already have the schema and data scripts, you could use VS Code with the SQL Server (mssql) extension to execute them against the older SQL Server version. Sublime Text is also able to handle large files; see this question for a list of editors with large-file support.

This will only work if you generated the scripts using the correct compatibility level (target server version).
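
If you want to double-check the target instance and the database you create there, a small T-SQL sketch is shown below; the database name `TargetDb` is a placeholder, and compatibility level 120 corresponds to SQL Server 2014:

```sql
-- Confirm the version of the instance you are loading the scripts into.
SELECT @@VERSION;

-- After the database has been created from the scripts, check or set
-- its compatibility level (120 = SQL Server 2014).
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'TargetDb';

ALTER DATABASE TargetDb SET COMPATIBILITY_LEVEL = 120;
```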

(Screenshot of the advanced scripting settings in the Generate Scripts wizard.)

Be aware that not everything is scripted by default (triggers, for example), so you might want to change other scripting options as well.
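
As a rough sanity check after running the scripts, you could compare object counts between the source and target databases. A sketch, to be run on both servers; `TargetDb` is a placeholder:

```sql
USE TargetDb;

-- Count user objects per type; comparing the output on source and target
-- highlights anything (triggers, for example) that was not scripted.
SELECT type_desc, COUNT(*) AS object_count
FROM sys.objects
WHERE is_ms_shipped = 0
GROUP BY type_desc
ORDER BY type_desc;

-- Triggers are also listed separately in sys.triggers.
SELECT COUNT(*) AS trigger_count FROM sys.triggers;
```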

Preben Huybrechts