280

I used Red Gate SQL Data Compare to generate a .sql file so I could run it on my local machine. The problem is that the file is over 300 MB: copy and paste fails because the clipboard can't handle that much text, and when I try to open the file in SQL Server Management Studio I get an error about the file being too large.

Is there a way to run a large .sql file? The file basically contains data for two new tables.

Jim Ashworth
Jack

13 Answers

518

From the command prompt, start up sqlcmd:

sqlcmd -S <server> -i C:\<your file here>.sql 

Just replace <server> with the location of your SQL box and <your file here> with the name of your script. Don't forget, if you're using a SQL instance the syntax is:

sqlcmd -S <server>\<instance>
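For instance, a hypothetical end-to-end invocation against a local named instance, using Windows authentication (-E) and capturing the results in a log file (the server, database, and paths here are placeholders, not values from the question):

sqlcmd -S localhost\SQLEXPRESS -E -d MyDatabase -i C:\scripts\bigdata.sql -o C:\scripts\bigdata.log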

Here is the list of all arguments you can pass sqlcmd:

Sqlcmd            [-U login id]          [-P password]
  [-S server]            [-H hostname]          [-E trusted connection]
  [-d use database name] [-l login timeout]     [-t query timeout] 
  [-h headers]           [-s colseparator]      [-w screen width]
  [-a packetsize]        [-e echo input]        [-I Enable Quoted Identifiers]
  [-c cmdend]            [-L[c] list servers[clean output]]
  [-q "cmdline query"]   [-Q "cmdline query" and exit] 
  [-m errorlevel]        [-V severitylevel]     [-W remove trailing spaces]
  [-u unicode output]    [-r[0|1] msgs to stderr]
  [-i inputfile]         [-o outputfile]        [-z new password]
  [-f <codepage> | i:<codepage>[,o:<codepage>]] [-Z new password and exit] 
  [-k[1|2] remove[replace] control characters]
  [-y variable length type display width]
  [-Y fixed length type display width]
  [-p[1] print statistics[colon format]]
  [-R use client regional setting]
  [-b On error batch abort]
  [-v var = "value"...]  [-A dedicated admin connection]
  [-X[1] disable commands, startup script, environment variables [and exit]]
  [-x disable variable substitution]
  [-? show syntax summary] 
Kevin
Ray Booysen
  • I have an insert like `INSERT INTO table_name (text) VALUES ('some text bla bla 'bla' text')` and I get an error because of the quotes. Which parameter do I have to use to handle this? Thanks – Oleksandr Fentsyk Jul 15 '13 at 20:49
  • Can we keep the script on an external drive, like the E: drive, instead of the C: drive? – Ani Aug 09 '13 at 17:35
  • Yup, nothing that stops you. – Ray Booysen Aug 11 '13 at 17:07
  • This worked for me, but I had to get rid of the -o at the end. – bunggo Dec 11 '14 at 21:59
  • I also removed the -o from the end and ran an 800 MB file. I also split the file into schema (which opens in Management Studio and can be edited) and data, which was imported via the command line, because running the entire file on a box with no C:\ drive dumped all my tables into `master`. – Gareth Davidson Apr 14 '15 at 16:47
  • My SQL Server was using the default instance (MSSQLSERVER), so the -S \instance part was throwing me off. If this is your case, you can just run -S localhost (no instance, and whatever your server name actually is). – Robert Swilley Jun 17 '16 at 14:53
  • FYI: SQL Server 2008 R2, ran files from 400 MB to 6 GB; working as intended. – Mayer M Feb 16 '18 at 14:45
  • It failed to run a 75 GB sql file. – Aladdin Feb 23 '19 at 11:43
  • Is there an alternative for this command for larger sql files? – Anders Lindén Apr 24 '19 at 14:46
  • SQL files that large suggest that having raw SQL as your input is the wrong choice. If the official tooling can't cope with files that size, you could either split the files yourself or move the data in another medium, like a SQL backup. – Ray Booysen May 17 '19 at 02:06
  • Failed with 4.44 GB... – SiL3NC3 Feb 28 '22 at 13:44
  • Minimize the cmd prompt window to make the import run faster! – izio Mar 06 '22 at 12:54
84

I had exactly the same issue and struggled with it for a while, then finally found the solution, which is to set the -a parameter on sqlcmd in order to change its default packet size:

sqlcmd -S [servername] -d [databasename] -i [scriptfilename] -a 32767
Neil Thompson
Takuro
  • +1 for adding the -d option in the example. I used `USE [DBNAME]` in my sql file, which worked as well. Not sure which is the better or preferred method. – Shaakir Jan 26 '16 at 15:14
  • Thanks. I used the following command, as a user name and password were required to connect to the database: `sqlcmd -S -U -P -d -i ` – Hriju Feb 25 '16 at 07:21
  • Which problem did you solve by setting the packet size? Microsoft says: "A larger packet size can enhance performance..." (https://learn.microsoft.com/en-us/sql/tools/sqlcmd-utility) – The incredible Jan Jun 27 '17 at 08:26
  • @TheincredibleJan I think he is referring to this: https://stackoverflow.com/q/55881713/345659 – JumpingJezza Sep 25 '20 at 02:16
20

You can use this tool as well. It is really useful.

BigSqlRunner

NB: Broken link, so have updated it.

Paul
Yigit Yuksel
  • I thought it was useful as well, until I ran it with a 400 MB SQL file (full of INSERT statements). After running for 9 hours, nothing had been added to the database, but BigSqlRunner was consuming between 3 GB and 6 GB of memory. – Martijn Oct 24 '17 at 07:27
  • VERY useful. Usually I manually split the file (various tools) into chunks, then manually edit them so the SQL statements are whole, then execute via a batch script. This makes it SO easy. A 2.3 GB .sql file, easy. – Barry Apr 22 '18 at 01:39
17
  1. Open a command prompt with administrator privileges.

  2. Change directory to where the .sql file is stored.

  3. Execute the following command:

    sqlcmd -S 'your server name' -U 'user name of server' -P 'password of server' -d 'db name' -i script.sql

Syam Kumar
8

I am using MSSQL Express 2014 and none of the solutions worked for me; they all just crashed SQL. As I only needed to run a one-off script with many simple insert statements, I got around it by writing a little console app as a very last resort:

class Program
{
    static void Main(string[] args)
    {
        RunScript();
    }

    private static void RunScript()
    {
        // Entity Framework context for the target database
        My_DataEntities db = new My_DataEntities();

        // Read the script line by line so the whole file is never held
        // in memory. Note: this assumes each line is a complete SQL
        // statement (e.g. one INSERT per line).
        using (var file = new System.IO.StreamReader("c:\\ukpostcodesmssql.sql"))
        {
            string line;
            while ((line = file.ReadLine()) != null)
            {
                // Skip blank lines and GO batch separators, which are
                // not valid T-SQL statements on their own
                if (string.IsNullOrWhiteSpace(line) || line.Trim() == "GO")
                    continue;

                db.Database.ExecuteSqlCommand(line);
            }
        }
    }
}
4

Run it at the command line with osql, see here:

http://metrix.fcny.org/wiki/display/dev/How+to+execute+a+.SQL+script+using+OSQL

BobbyShaftoe
3

Hope this helps you!

sqlcmd -U UserName -S <ServerName\InstanceName> -i U:\<Path>\script.sql
2

I had a similar problem. My file with the sql script was over 150 MB in size (with almost 900k very simple INSERTs). I used the solution advised by Takuro (as the answer in this question), but I still got an error with a message saying that there was not enough memory ("There is insufficient system memory in resource pool 'internal' to run this query").

What helped me was putting a GO command after every 50k INSERTs.

(It's not directly addressing the question (file size), but I believe it resolves a problem indirectly connected with the large size of the sql script itself: in my case, many insert commands in one batch.)
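If your file already exists without batch separators, a minimal C# sketch of that idea could look like the following; the paths and the 50,000 batch size are assumptions to adjust, and it presumes one INSERT per line, as data-compare tools typically emit:

using System;
using System.IO;

class GoSplitter
{
    static void Main()
    {
        const int batchSize = 50000; // emit a GO after this many INSERTs
        int count = 0;

        // Hypothetical paths: point these at your actual script files
        using (var reader = new StreamReader(@"C:\scripts\huge.sql"))
        using (var writer = new StreamWriter(@"C:\scripts\huge_batched.sql"))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                writer.WriteLine(line);

                // Count INSERT statements and close the batch periodically
                if (line.TrimStart().StartsWith("INSERT", StringComparison.OrdinalIgnoreCase)
                    && ++count % batchSize == 0)
                {
                    writer.WriteLine("GO");
                }
            }
        }
    }
}

Each GO makes sqlcmd send the statements accumulated so far as a separate batch, so no single batch grows large enough to exhaust the 'internal' resource pool.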

Bronek
2

sqlcmd -S [servername] -d [databasename] -i [scriptfilename] -a 32767

I successfully ran a 365 MB sql file with this command; it completed in about 15 minutes. It helped me solve a problem that had taken me a long time to figure out.

2

Run the script file

Open a command prompt window.

In the Command Prompt window, type: sqlcmd -S <ServerName\InstanceName> -i C:\yourScript.sql

Press ENTER.

Rafiqul Islam
1

Your question is quite similar to this one

You can save your file/script as .txt or .sql and run it from SQL Server Management Studio (I think the menu is Open/Query, then just run the query in the SSMS interface). You might have to update the first line, indicating the database to be created or selected on your local machine.

If you have to do this data transfer very often, you could then go for replication. Depending on your needs, snapshot replication could be OK. If you have to sync the data between your two servers, you could go for a more complex model such as merge replication.

EDIT: I didn't notice that you had problems with SSMS linked to file size. Then you can go for the command line, as proposed by others, snapshot replication (publish on your main server, subscribe on your local one, replicate, then unsubscribe), or even backup/restore.

Philippe Grondier
1

The file basically contains data for two new tables.

Then you may find it simpler to just move the data over with DTS (or SSIS, if this is SQL Server 2005+), provided the two servers are on the same network.

If the two servers are not on the same network, you can backup the source database and restore it to a new database on the destination server. Then you can use DTS/SSIS, or even a simple INSERT INTO SELECT, to transfer the two tables to the destination database.
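For the programmatic route, a minimal sketch using SqlBulkCopy (not the answerer's DTS/SSIS method, just an alternative for the same job; the connection strings and table name are placeholders) might look like this:

using System.Data;
using System.Data.SqlClient;

class BulkCopyTables
{
    static void Main()
    {
        // Hypothetical connection strings: adjust server, database, credentials
        const string sourceCs = "Server=SourceServer;Database=SourceDb;Integrated Security=true";
        const string destCs = "Server=.;Database=DestDb;Integrated Security=true";

        using (var source = new SqlConnection(sourceCs))
        using (var dest = new SqlConnection(destCs))
        {
            source.Open();
            dest.Open();

            // Stream the source table straight into the destination table;
            // repeat this block for the second table
            using (var cmd = new SqlCommand("SELECT * FROM dbo.NewTable1", source))
            using (IDataReader reader = cmd.ExecuteReader())
            using (var bulk = new SqlBulkCopy(dest)
            {
                DestinationTableName = "dbo.NewTable1",
                BulkCopyTimeout = 0 // no timeout for a large copy
            })
            {
                bulk.WriteToServer(reader);
            }
        }
    }
}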

P Daddy
  • If one or two tables generate a file that big, it is safe to assume the DB is a couple of Gigs, which makes backup and restore infeasible in many cases. – Captain Kenpachi Nov 08 '12 at 11:19
0

For all the fellows still encountering problems importing really large SQL dumps, there is probably another way worth considering.

What can also be considered, when possible: if you have access to the server, you could export the database in multiple parts, like first the structure, then per table (or related set of objects) an export of the data in smaller pieces, instead of one big file.

When you don't have access to the server and/or are required to use the existing big file, you could try to split it into parts with SQLDumpSplitter: https://philiplb.de/sqldumpsplitter3/.

Then import the pieces to get a full copy of the database.

Good luck, guys.

SiL3NC3