11

I have a SQL file/SQL string which is about 20 MB. SQL Server simply cannot accept this file. Is there a limit on the maximum size of the .sql file or variable which is used to query or insert data into SQL Server?

By variable, I mean passing a variable to SQL Server through some programming language or even an ETL tool.

Steam
  • I have used `sqlcmd` with large (~20 MB) files before and never had a problem. Same with SQL Server Management Studio. – Paul Draper Nov 17 '13 at 08:28
  • @PaulDraper - about 3 weeks ago, I used a 200 MB file with Management Studio and I got a weird error. My query was okay though. Is there a way to find the limits? – Steam Nov 17 '13 at 08:30
  • Well, it seems like you are doing it ;) But seriously, I have never seen published limits. Realize that this answer will depend on the external tool or library used. – Paul Draper Nov 17 '13 at 08:32
  • @PaulDraper - here is that error which never got resolved - http://stackoverflow.com/questions/19647922/sql-server-error-hresult-e-fail-has-been-returned-from-a-call-to-a-com-compone – Steam Nov 17 '13 at 08:41

9 Answers

10

You can use SQLCMD, but I just ran into a 2 GB file-size limit with that command-line tool, even though I had a GO after every statement. I got an "Incorrect syntax" error once the 2 GB boundary was crossed.

After some searching, I found this link: https://connect.microsoft.com/SQLServer/feedback/details/272757/sqlcmd-exe-should-not-have-a-limit-to-the-size-of-file-it-will-process

The linked page says that every character after the 2 GB mark is ignored, which would explain my "Incorrect syntax" error.

DarkTygur
5

Yep, I've seen this before. There is no size limit on .sql files. What matters is the kind of logic being executed from within that .sql file. If you have a ton of quick inserts into a small table, e.g. INSERT INTO myTable (column1) VALUES (1), then you can run thousands of them from one .sql file, whereas if you're applying heavy logic in addition to your inserts/deletes, you'll run into these problems. The size of the file isn't as important as what's in the file.

When we came across these in the past, we ran the .sql files from SQLCMD. Very easy to do. You could also create a StreamReader in C# or VB to read the .sql file and build a query to execute.

SQLCMD:

    SQLCMD -S [Servername] -E -i [SQL Script]

Was that clear enough? If you post an example of what you're trying to do then I could write some sample code for you.

When I first experienced this problem, the only solution I found was to split the .sql file into smaller ones. That didn't work for our situation, but SQLCMD did. We later implemented a utility that read these large files and executed them with some quick C# programming and a StreamReader.
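For anyone going the C#/StreamReader route, here is a minimal sketch of that kind of utility (not the exact code we used): it streams the .sql file line by line, treats lines containing only GO as batch separators, and sends each batch to the server. The file path, connection string, and class name are placeholders you would replace with your own.

    // Minimal sketch: stream a large .sql file and execute it batch by batch.
    // GO is a client-side batch separator, not a T-SQL statement, so it must be
    // stripped out and used to split the script before sending anything to SQL Server.
    using System;
    using System.Data.SqlClient;
    using System.IO;
    using System.Text;

    class ScriptRunner
    {
        static void Main()
        {
            // Placeholder path and connection string -- replace with your own.
            string scriptPath = @"D:\Scripts\MyLargeFile.sql";
            string connectionString = "Server=MyServer;Database=MyDb;Integrated Security=true;";

            using (var connection = new SqlConnection(connectionString))
            using (var reader = new StreamReader(scriptPath))
            {
                connection.Open();
                var batch = new StringBuilder();
                string line;

                while ((line = reader.ReadLine()) != null)
                {
                    // A line containing only GO ends the current batch.
                    if (line.Trim().Equals("GO", StringComparison.OrdinalIgnoreCase))
                    {
                        Execute(connection, batch);
                        batch.Clear();
                    }
                    else
                    {
                        batch.AppendLine(line);
                    }
                }
                Execute(connection, batch);   // run any trailing statements
            }
        }

        static void Execute(SqlConnection connection, StringBuilder batch)
        {
            if (batch.ToString().Trim().Length == 0) return;
            using (var command = new SqlCommand(batch.ToString(), connection))
            {
                command.CommandTimeout = 0;   // large batches can take a while
                command.ExecuteNonQuery();
            }
        }
    }

Because the file is read line by line, memory use stays flat no matter how large the script gets; only one batch is held in memory at a time.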

Nick H.
  • Insert into table about 500K times. Only long insert statements with no extra logic like select, if-else etc. None of those big insert files executed. – Steam Jan 10 '14 at 00:23
  • Did you run it from SQLCMD? Was it successful? If you're just trying to look at the file, open it with notepad. If it doesn't open then there's an issue with the file. Any weird characters at the top when opening in notepad? – Nick H. Jan 10 '14 at 21:29
  • I don't want to use SQLCMD because that option requires a lot of settings to be changed on servers. Any other way ? There are no weird characters at all. – Steam Jan 10 '14 at 21:44
  • If you're limited to using SSMS only then you could break up the sql file into smaller sql files. Sorry, I hate that answer too. If you can use a C# utility, I would help you write a quick stream reader to execute the script. – Nick H. Jan 13 '14 at 14:23
4

The size of the SQL file should be limited only by the memory available on your PC/workstation. However, if you don't want to use osql and/or third-party tools, there is a solution for this within SSMS itself. It's called SQLCMD Mode, and it lets you run a SQL file by referencing it rather than actually opening it in the editor.

Basically, all you have to do is:

  1. In your Query menu select SQLCMD Mode
  2. Look up the path to your called script (large SQL file)
  3. Open up a New Query (or use an existing one) and write this on a new line

    :r D:\PathToMyLargeFile\MyLargeFile.sql

  4. Run that (calling) script

If you need to use a variable in your called script, you have to declare it in the calling script. Your calling script would then look like this:

    :setvar myVariable "My variable content"
    :r D:\PathToMyLargeFile\MyLargeFile.sql

Let's say your called script uses the variable for content that should be inserted into rows. Then it should look something like this...

    INSERT INTO MyTable (MyColumn)
    SELECT '$(myVariable)'
OzrenTkalcecKrznaric
1

Pranav was kind of on the right track in referencing the Maximum Capacity Specifications for SQL Server article; however, the applicable limit for executing queries is:

Length of a string containing SQL statements (batch size)¹: 65,536 * network packet size

¹ Network packet size is the size of the tabular data stream (TDS) packets used to communicate between applications and the relational Database Engine. The default packet size is 4 KB and is controlled by the network packet size configuration option, so with the default you get 65,536 * 4 KB = 256 MB per batch.

Additionally, I have seen problems with large numbers of SQL statements executing in SQL Server Management Studio. (See SQL Server does not finish execution of a large batch of SQL statements for a related problem.) Try adding SET NOCOUNT ON to your SQL to prevent sending unnecessary data. Also, when doing large numbers of INSERT statements, try breaking them into batches using the GO batch separator.
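Since that limit scales with the network packet size, it can be worth checking what packet size your connection actually uses before assuming the default 4 KB. Here is a minimal C# sketch (the connection string is a placeholder) using SqlConnection.PacketSize:

    // Minimal sketch: report the TDS packet size for a connection and the
    // resulting approximate batch-size ceiling (65,536 * packet size).
    using System;
    using System.Data.SqlClient;

    class PacketSizeCheck
    {
        static void Main()
        {
            // Placeholder connection string -- point it at your own server.
            string connectionString = "Server=MyServer;Database=MyDb;Integrated Security=true;";

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                int packetSize = connection.PacketSize;   // bytes, e.g. 4096 by default
                Console.WriteLine("Network packet size: {0} bytes", packetSize);
                Console.WriteLine("Approx. max batch size: {0} MB",
                    (long)packetSize * 65536 / (1024 * 1024));
            }
        }
    }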

JamieSee
  • Thanks. Is it guaranteed that GO will prevent this problem ? If so, then why ? – Steam Jan 15 '14 at 23:26
  • Yes, GO will prevent the problem. GO is not a SQL statement - it is a command to Management Studio or sqlcmd to send the statements up to the GO command to SQL Server for processing, wait for the results, then send the next set of statements up to the next GO, and so on up to the end of the file. You cannot put a GO inside a block of code (BEGIN/END) - each chunk must be a valid set of statements. – Mike Dimmick Jan 16 '14 at 17:57
1

I think your concern comes from trying to open your file in SSMS. Within SSMS, opening a 20 MB file would likely be problematic, no different from trying to open the same file in Notepad or most text editors.

For the record, for other posters: I don't think the question has anything at all to do with SQL column, table, object, or database sizes! It's simply a problem with using the IDE.

If the file is pure data to be imported, with no SQL commands, try a bulk import.

If the file is SQL commands, you're going to need an editor that can handle large files, like Vedit (http://www.vedit.com/). It won't be able to execute the SQL, though; you must do that from the command line using sqlcmd, as noted above.

Xavier J
0

Here are a couple of links that I hope might be helpful for you.

JB9
  • In what way will those links help me ? – Steam Jan 10 '14 at 21:43
  • I was just doing some research around this topic and thought of sharing it with you: http://www.infosys.com/microsoft/resource-center/Documents/SQLServer-FILESTREAM-BLOBs.pdf. I am not sure whether this is going to work, but give it a try – JB9 Jan 10 '14 at 22:39
0

I came across this article on MSDN, "Maximum Capacity Specifications for SQL Server". Going through it, I was able to find:

For SQL Server 2012, 2008 R2, and 2005:

Maximum File size (data): 16 terabytes

Maximum bytes per varchar(max), varbinary(max), xml, text, or image column: 2^31-1 bytes (about 2 GB)

For more details on Maximum Capacity Specifications for SQL Server, refer:

For SQL Server 2012: http://msdn.microsoft.com/en-us/library/ms143432(v=sql.110).aspx

For SQL Server 2008 R2: http://msdn.microsoft.com/en-us/library/ms143432(v=sql.105).aspx

For SQL Server 2005: http://msdn.microsoft.com/en-us/library/ms143432(v=sql.90).aspx

For SQL Server 2000, I am not sure, since MSDN seems to have removed the related documentation.

Pranav Singh
  • I think this one might be relevant to my case - "Length of a string containing SQL statements" (Batch size). It says max length = 65,536 * Network packet size. Generally network packet size = 4kb. So max string size = 256mb. But, even 100, 150 and 200mb files failed for me. Maybe my packet size is small ? How do I find out ? – Steam Jan 14 '14 at 18:11
  • Packet size settings can be manipulated in web services which send data over the network. It changes based on which technology you use, e.g. in WCF in .NET we use a binding in web.config which can specify these settings like ` ` – Pranav Singh Jan 15 '14 at 03:12
  • How do I find out my packet size from, say a C# program ? – Steam Jan 15 '14 at 17:47
  • In c#, if you are using Windows Communication Foundation(WCF), then answers to http://stackoverflow.com/questions/966323/how-to-programmatically-modify-wcf-app-config-endpoint-address-setting http://stackoverflow.com/questions/11583882/programatically-adding-an-endpoint and msdn link http://msdn.microsoft.com/en-us/library/ff647110.aspx will help. If you are not using services then things will be difficult for you... – Pranav Singh Jan 16 '14 at 03:16
  • I want to use a C# script inside SSIS or standalone script in visual studio. – Steam Jan 16 '14 at 17:49
0

It is not clear from your question what the SQL file contains. The solution I suggest below is only applicable if the SQL file you refer to has only insert statements.

The fastest way to insert large amounts of data into SQL Server is to use the bulk copy functionality (the bcp utility).

If you have SQL Server Management Studio, then you should also have the bcp utility; look in C:\Program Files (x86)\Microsoft SQL Server\90\Tools\Binn (or the equivalent for your version).

If you want to use the bcp utility, you need to create a file that contains just the data; it can be comma-delimited. Refer to the bcp documentation for what the file should look like.
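If running the bcp command line is not an option (the asker mentions wanting a C# script in SSIS or Visual Studio), SqlBulkCopy is the programmatic counterpart of bcp. Here is a minimal sketch, with a hypothetical single-column table, file path, and connection string:

    // Minimal sketch: load a comma-delimited data file (data only, no SQL
    // statements) into a table with SqlBulkCopy, the in-process equivalent of bcp.
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class BulkLoader
    {
        static void Main()
        {
            // Placeholder names -- replace with your own.
            string connectionString = "Server=MyServer;Database=MyDb;Integrated Security=true;";
            string dataPath = @"D:\Data\MyTable.csv";

            // Stage the rows in a DataTable (a single column here for simplicity).
            var table = new DataTable();
            table.Columns.Add("Column1", typeof(string));

            foreach (string line in File.ReadLines(dataPath))
            {
                table.Rows.Add(line.Split(',')[0]);
            }

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                using (var bulkCopy = new SqlBulkCopy(connection))
                {
                    bulkCopy.DestinationTableName = "dbo.MyTable";
                    bulkCopy.BatchSize = 10000;   // send rows to the server in chunks
                    bulkCopy.WriteToServer(table);
                }
            }
        }
    }

For very large files you would stream rows through an IDataReader instead of buffering everything in a DataTable, but the idea is the same.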

Manvendra Gupta
0

For the Maximum Capacity Specifications for SQL Server, you can check here: http://msdn.microsoft.com/en-us/library/ms143432(v=sql.120).aspx

If you ask "Is there a limit on the maximum size of the .sql file or variable which is used to query or insert data into SQL Server?", I will say yes, there is a limit for each variable. And if you want to upload a file with a big size, I recommend you convert your file to varbinary, or you can increase the maximum upload size in your web system. Here is an example: http://msdn.microsoft.com/en-us/library/aa479405.aspx
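For completeness, here is a minimal sketch of the varbinary route this answer hints at: storing the file's bytes in a varbinary(max) column instead of executing them. The table, column names, and connection string are hypothetical.

    // Minimal sketch: store a large file's bytes in a varbinary(max) column.
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class VarbinaryUpload
    {
        static void Main()
        {
            // Placeholder names -- replace with your own.
            string connectionString = "Server=MyServer;Database=MyDb;Integrated Security=true;";
            byte[] fileBytes = File.ReadAllBytes(@"D:\Scripts\MyLargeFile.sql");

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "INSERT INTO dbo.Documents (Name, Content) VALUES (@name, @content)", connection))
            {
                command.Parameters.AddWithValue("@name", "MyLargeFile.sql");
                // A size of -1 maps the parameter to varbinary(max).
                command.Parameters.Add("@content", SqlDbType.VarBinary, -1).Value = fileBytes;

                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }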