5

I have a big SQL file that does not fit into memory and needs to be executed against Microsoft SQL Server 2008. It seems that the sqlcmd.exe tool always loads the whole file into memory first, which is impossible in this case. Any ideas?

Unfortunately, I can't split the script because it is generated by Red Gate's excellent SQL Data Compare. The entire script is one big transaction and I want to leave it that way. I had never thought that having a gigantic script is unusual because having a lot of data is common in the database world. The script is 3 GB in size.

GusP
  • 2,454
  • 2
  • 23
  • 32
  • 2
    SQL file does not fit into memory? Really??? – Rockcoder Jun 04 '09 at 13:48
  • 1
    How big is the file ?!?!?!?!?!? – OneSHOT Jun 04 '09 at 13:52
  • 4
    Just to add to all of the astonished comments above... perhaps if you describe why you're running such a large file people can suggest an alternative. For example, if it's so large because you're manually inserting millions of rows from another database then perhaps an ETL process would be better than extracting it all to a script and running. – Tom H Jun 04 '09 at 13:58
  • 2
    If the file is data (i.e. lots of INSERT statements), then using a SQL file is probably the wrong approach, better to use SSIS with a CSV file or similar. – Richard Jun 04 '09 at 14:01
  • @Mr. Brownstone: it can happen with scripted database installs that have to do a lot of inserts. – Joel Coehoorn Jun 04 '09 at 14:06
  • Unfortunately, I cannot split the script because it is generated by Red Gate's excellent SQL Data Compare. The entire script is one big transaction and I want to leave it that way. I had never thought that having a gigantic script is unusual because having a lot of data is common in the database world. The script is 3 GB in size. –  Jun 04 '09 at 14:17
  • This sounds like a question to ask RedGate. Maybe even a bug report. Or maybe just time to upgrade your machine to a 64-bit processor with 16 or 32GB of memory. – John Saunders Jun 04 '09 at 14:27
  • 1
    If you're just moving a massive amount of data from one database/table to another, then RG's Data Compare is the wrong tool. If you're synchronizing two tables (two tables contain customers and you need to combine them, or one table contains some data and you need to add non-matching rows from another table, but you can't delete what's already there), this is where Data Compare excels. If you're just moving data to an empty table, I'd check out other methods, like SSIS, which can still be done inside a transaction. – SqlRyan Jun 04 '09 at 19:11

8 Answers

3

RedGate's SQL Compare has an option to execute the statements directly, instead of generating a SQL script and executing it later. Is there a reason this wouldn't work - in other words, is there a reason you require a SQL script and can't use the application's "synchronize now" functionality?

SqlRyan
  • 33,116
  • 33
  • 114
  • 199
  • That's a good point (Red Gate makes some awesome products), but he may not have update access in his environment. That's the position I'm in. I need to generate scripts and pass them on to DBAs to run. – wcm Jun 04 '09 at 14:47
  • Touché. In that case, is it necessary to update every single table in a single transaction? Can you instead break the operation into 3-4 different scripts, meaning run the Red Gate tool on only 1/4 of your tables each time (picking tables so you balance the size of each script)? Though this isn't a pure transaction for your entire update, each piece is a transaction, so your data won't be left in a "damaged" state if a script fails, and you could even request that DBAs run them simultaneously during a maintenance window, so users won't see partially updated data. – SqlRyan Jun 04 '09 at 15:10
  • You could make the argument that the DBA could run SQL Data Compare for him. The question for me is whether SQL Data Compare will run into the same issue. I mean, isn't Red Gate just running the script for you in the same way that Management Studio runs the script? I'm not saying don't try it; it's a good idea. – wcm Jun 04 '09 at 15:41
  • wcm is right: I cannot update the database from the machine SQL Data Compare runs on, and I doubt that will change very soon. Is it really that hard to stream a script instead of executing it all at once? By the way, the script is for one gigantic table, not for many. –  Jun 04 '09 at 16:00
  • We own a copy of Red Gate Data Compare, so I just did a compare and had the tool synchronize the databases while running SQL Profiler, and it looks like the tool actually opens the transaction, runs the statements in batches, and then commits at the end. Since it submits smaller batches at a time, it should run just fine on a DBA's workstation. Is this a viable option? – SqlRyan Jun 04 '09 at 16:19
  • Are the batches for each table being updated? If so, since he is only doing one table this won't work for him. – wcm Jun 04 '09 at 17:38
2

I ran into this problem a few months ago. I generate sync scripts with SQL Data Compare on a weekly and monthly basis for several of our catalog databases and they are routinely larger than 500 MB. My solution was writing a VBScript that chops the update script into batches of 50 to 1,000 commands. The problem with this approach is losing the ability to roll back all changes if something breaks halfway through your database update.
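For illustration only, the chopping idea might look something like this (a rough sketch in Python rather than VBScript; it assumes the script uses GO separators on their own lines, and the paths and batch size are made up):

    # Rough sketch: stream the big sync script and cut it into smaller files,
    # splitting only at GO separators so individual batches stay intact.
    # Assumes "GO" appears on its own line between batches and the file is UTF-8.
    SOURCE = r"C:\sync\full_sync.sql"        # hypothetical path to the huge script
    BATCHES_PER_FILE = 500                   # tune anywhere from ~50 to 1000

    def open_chunk(index):
        return open(r"C:\sync\chunk_%04d.sql" % index, "w", encoding="utf-8")

    chunk_index, batches_in_chunk = 1, 0
    out = open_chunk(chunk_index)
    with open(SOURCE, "r", encoding="utf-8") as src:   # read line by line, never fully in memory
        for line in src:
            out.write(line)
            if line.strip().upper() == "GO":           # end of one batch
                batches_in_chunk += 1
                if batches_in_chunk >= BATCHES_PER_FILE:
                    out.close()
                    chunk_index += 1
                    batches_in_chunk = 0
                    out = open_chunk(chunk_index)
    out.close()

Note that splitting this way also splits the one big transaction, which is exactly the rollback trade-off mentioned above.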

1

What/who created the SQL script? Get whatever created the file to split the script up into logical chunks, by either transaction or statement (depending on how the file is structured). If the source can't do this, then whip up a script to split the file up logically.

Welbog
  • 59,154
  • 9
  • 110
  • 123
1

If it is that big, the script is either too complex or too repetitive. In either case, as others have suggested, the only sensible thing is to break it down into manageable chunks.

Is this a one-off exercise or a regular event?

CJM
  • 11,908
  • 20
  • 77
  • 115
1

I've had this problem before where the script had an enormous XML string that was being used with OpenXML. The actual SQL was rather minimal, updating some values in a table.

I ended up inserting the data (in chunks) into a temporary table until all the info that was in the XML was stored. Then I ran my update statement.
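For illustration, a staging pattern along those lines might look like this in Python with pyodbc; the connection string, table, and column names are all made up, and parse_items is a stand-in for however you pull values out of the XML:

    # Sketch: stage values into a temp table in small chunks, then apply them
    # with one set-based UPDATE. Every name below is hypothetical.
    import pyodbc

    def parse_items():
        # stand-in for pulling (id, value) pairs out of the big XML document
        yield 1, "example value"

    conn = pyodbc.connect("DRIVER={SQL Server};SERVER=myServer\\instanceName;"
                          "DATABASE=myDatabase;Trusted_Connection=yes")
    cur = conn.cursor()
    cur.execute("CREATE TABLE #staging (ItemId int PRIMARY KEY, NewValue nvarchar(200))")

    CHUNK, pending = 1000, []
    for item_id, new_value in parse_items():
        pending.append((item_id, new_value))
        if len(pending) >= CHUNK:                      # flush one chunk at a time
            cur.executemany("INSERT INTO #staging (ItemId, NewValue) VALUES (?, ?)", pending)
            pending = []
    if pending:
        cur.executemany("INSERT INTO #staging (ItemId, NewValue) VALUES (?, ?)", pending)

    # one set-based update once everything is staged
    cur.execute("UPDATE t SET t.SomeColumn = s.NewValue "
                "FROM dbo.TargetTable AS t JOIN #staging AS s ON s.ItemId = t.ItemId")
    conn.commit()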

Added later after more data got posted:

You may want to select large chunks in the tool and have SQL Data Compare generate the scripts in chunks. That way you still get the transactions. You can select large sections by simply highlighting a range and hitting the space bar.

wcm
  • 9,045
  • 7
  • 39
  • 64
0

Use the sqlcmd command.

Example:

sqlcmd -S myServer\instanceName -i C:\myScript.sql
0

The way I understand it, SSMS is 32-bit so it can't load a script over 1.5-2 GB. You can run the script in SQLCMD.exe, but then you may run into problems because of the transaction size -- SqlCmd will keep a whole transaction in memory. So what you can do then in SQL Data Compare is go into the options and use "split transactions", which may help.

SQL Data Compare will also do partial updates to BLOBs, which will solve the "enormous BLOB" issue.

This of course is based on the latest version of Data Compare. Some versions may not have these features.

Another option may be to use SQL Compare to create a schema script to a folder, then use SDC to sync the data into that folder. Then you have a file for each table rather than one massive file.

Hope this helps.

Wonko
  • 331
  • 1
  • 5
0

1-800-redgate-support.....

or

  • break up the transaction script into smaller files
  • set the database to single-user mode
  • take a full backup of the database
  • run each smaller script file; if there is a failure: restore the backup, fix the script, try again
  • take the database out of single-user mode, all done (see the sketch of this workflow below)
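A rough sketch of that workflow, driving sqlcmd from Python (server, database, backup path, and script file names are all hypothetical, and you would want to harden the error handling):

    # Sketch: single-user mode, full backup, run each chunk, restore on failure.
    import subprocess, sys

    SERVER, DB = r"myServer\instanceName", "myDatabase"
    BACKUP = r"C:\backups\myDatabase_presync.bak"
    SCRIPTS = [r"C:\sync\chunk_0001.sql", r"C:\sync\chunk_0002.sql"]   # the smaller files

    def run_sql(query):
        # -b makes sqlcmd exit with a non-zero code if a batch fails
        subprocess.check_call(["sqlcmd", "-S", SERVER, "-E", "-b", "-Q", query])

    def run_file(path):
        subprocess.check_call(["sqlcmd", "-S", SERVER, "-E", "-b", "-d", DB, "-i", path])

    run_sql("ALTER DATABASE [%s] SET SINGLE_USER WITH ROLLBACK IMMEDIATE" % DB)
    run_sql("BACKUP DATABASE [%s] TO DISK = N'%s' WITH INIT" % (DB, BACKUP))
    try:
        for script in SCRIPTS:
            run_file(script)
    except subprocess.CalledProcessError:
        # something failed part-way: put the database back the way it was
        run_sql("RESTORE DATABASE [%s] FROM DISK = N'%s' WITH REPLACE" % (DB, BACKUP))
        sys.exit("a script failed; the database was restored from backup")
    finally:
        run_sql("ALTER DATABASE [%s] SET MULTI_USER" % DB)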
KM.
  • 101,727
  • 34
  • 178
  • 212