
I am using SQL Server 2012, where I need to run a script of INSERT statements, but the file is 2.7 GB. When I try to open the file in the SQL editor it shows a SystemOutOfMemory error, and Notepad is also unable to open it. Any suggestions on how to open it?

BIDeveloper

19 Answers


Try opening the file in a browser and then copying the lines you need into an editor. That worked for me.

Gabriel

EmEditor is the best solution for you.

I faced a similar situation where I had to edit a 3 GB .sql file. After a long search and multiple "Not Responding" windows, EmEditor did the job.

I tried many editors on Windows; nothing worked, they all crashed or hung.

For one-time or short-term use you can use the trial version. For longer or permanent use you will have to buy the premium version.

Below is the download link.

https://www.emeditor.com/#download

Mashmoom

I had the same problem today with multi-gigabyte SQL files, so I wrote a command-line utility to split them into smaller files to process.

splitfile.exe -file "largefile.sql" -batch 5000

http://www.esensible.com/our-products/utilities/splitfile-split-large-text-files/

The source code is there as well (Visual Studio 2015).
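If you cannot run the utility, the standard `split` command (available on Linux and in Git Bash on Windows) can do something similar on line boundaries. This is just a sketch with placeholder file and table names; note that a purely line-based split can still cut a multi-line statement in half, so check the chunk boundaries before importing.

```shell
# Demo input: 12,000 one-line INSERT statements standing in for the real dump
seq 1 12000 | sed 's/.*/INSERT INTO MyTable VALUES (&);/' > largefile.sql

# Split into 5000-line chunks named part_aa.sql, part_ab.sql, ...
# Assumes each INSERT statement sits on a single line.
split -l 5000 --additional-suffix=.sql largefile.sql part_

ls part_*.sql   # part_aa.sql part_ab.sql part_ac.sql
```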

pfeds
  • This solution creates more problems later when importing into the database; it breaks the file in many odd places, so you will have to be careful. Thank you. – Mashmoom Feb 18 '20 at 16:56
  • It seems like that site is currently not working, but the exe and source code can still be downloaded http://www.esensible.com/wp-content/uploads/2017/03/splitfile.zip and http://www.esensible.com/wp-content/uploads/2017/03/splitfile-source-code.zip See here for details: https://web.archive.org/web/20200227131607/http://www.esensible.com/our-products/utilities/splitfile-split-large-text-files/ – Shaggie Jan 04 '23 at 18:49

First of all, you are doing something wrong. Describe how and why you got a 2.7 GB file; most likely there is an alternative solution to your problem.

You can execute a file of this size with the command-line utility sqlcmd.exe (MSDN: sqlcmd utility):

sqlcmd.exe -S servername -U login -P password -d databasename -i BigBigFile.sql -o out.txt

But it will be very slow, since the inserts will run one row at a time.
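One way to mitigate the row-by-row slowness (my own sketch, not part of the original answer) is to pre-process the script so that each batch of inserts runs inside a single transaction; with Git Bash or Linux, an awk one-liner can add the wrappers. The batch size of 1000 and the file names are placeholders.

```shell
# Demo input: 2,500 single-line INSERT statements
seq 1 2500 | sed 's/.*/INSERT INTO MyTable VALUES (&);/' > BigBigFile.sql

# Wrap every 1000 statements in one transaction so the server commits
# per batch instead of per row (assumes one statement per line).
awk 'NR % 1000 == 1 { print "BEGIN TRANSACTION;" }
     { print }
     NR % 1000 == 0 { print "COMMIT;" }
     END { if (NR % 1000 != 0) print "COMMIT;" }' BigBigFile.sql > Batched.sql
```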

Also, I do not see any reason to open such a large file in an editor. But if you want to: I opened a file of more than 3 GB in the internal editor of the file manager FarManager 3.0 x64 (my config: Win7 x64, 8 GB RAM, i7-3770), and it was also quite slow.

Perhaps you should look at other options for transferring the data:

MSDN Bulk import/export

MSDN SQL Server Import and Export Wizard

MSDN BACKUP and RESTORE

AlexK
  • I have generated a script of a table from another database because I am not allowed to take a backup of the complete database, and this table has so many records that it makes a very huge file. I can't use cmd; I have to open this file to complete the required action – BIDeveloper Jun 13 '14 at 09:16
  • @user2229749 What about Import/Export Wizard? – AlexK Jun 13 '14 at 10:09

https://www.emeditor.com/#download

It easily opened a 4 GB file for me.

Shai Epstein

The SQL editor can open a file of up to roughly 500 MB even without very good specs, so something seems to be going wrong here. If you want to move data from one database to another, use the Import/Export Wizard, an SSIS package, or a command-line utility; scripting a table and then running the inserts is not a good approach.


I opened a 5.14 GB SQL file in the IntelliJ IDEA editor. It didn't even take 30 seconds to load.

bindu

You can always use the command-line utility sqlcmd.

Open cmd and type:

SQLCMD -S serverName -d databaseName -i yourfile.sql -U username -P password

More info here: https://learn.microsoft.com/en-us/sql/ssms/scripting/sqlcmd-run-transact-sql-script-files?view=sql-server-2017

I've just run a 7 GB file with 8 million inserts successfully.

Ulad Melekh
  • This definitely works; it is slow, yes, but it works. Although user4725980 suggested a better method when you are doing the transfer yourself and have access to the source, rather than just receiving a large file. Your method does work, though; the file name might need to be in quotes. It would have been nice to know how long the 8 million inserts took. – Ken Dec 29 '21 at 01:08

I have a similar problem with a 6 GB .sql dump that a colleague passed me. I am still wondering how he got to that.

I was able to open the file through terminal in Linux with vi/vim.

Why don't you try creating a virtual machine with CentOS or some other flavour, transferring the file there (over SSH, using FileZilla) and opening it there? Install a server version, without a GUI.

I bet you'll be able to open it there. Otherwise, splitting the file would be something to look into.

Komodoro
  • Hi, I got that issue resolved with BCP (Bulk Copy Program) using the command-line utility. However, in my scenario we don't have Linux. But this is good information. Thanks – BIDeveloper Jan 29 '18 at 07:04
  • @PiyushJain Were you able to open the _.sql_ with that program, or just to copy it around? I got a VM with 16 GB RAM running Ubuntu Server and MySQL + phpMyAdmin, plus the Toad client to check the database. Not even Toad could import such a file. So I left the VM running the import of the database into MySQL (or, more accurately, MariaDB) for 3 days. Yeah, it is slow metal, but still fine for tests: an HP ProLiant DL380 G5 running ESXi 6.0.2. I FTP'ed the file onto the server, if you are wondering. – Komodoro Feb 02 '18 at 15:49

I got the same problem, but I solved it.

I had a 5 GB .sql file that I needed to edit because of a syntax error. I tried several editors, but none of them worked.

Then I tried the Visual Studio Code editor, and it worked.

  • First, download VS Code from https://code.visualstudio.com/download.
  • After installing VS Code, open your file.
  • If the file is larger than the editor's configured maximum, VS Code suggests increasing the maximum file size (in MB).
  • Increase it as much as you need (I suggest about three times the size of your file).
  • Then restart VS Code. After that, you should be able to open your file.
vaibhav
  • This appears to duplicate [an existing answer](/a/56026950/874188), though you have more details. Perhaps a better approach going forward would be to suggest an edit to the old answer to add more details. – tripleee Sep 23 '19 at 06:21

For cross-platform use, I find that VS Code is actually the best here. If it notices that it's a large file, it will automatically disable syntax highlighting and all the nice features, but leaves you with at least basic search and replace, etc. I use this regularly for files of about 1-3 GB for WordPress, because sometimes the permalinks don't update when changing domain names for dev/staging sites.

If you happen to have Linux or Git Bash installed on Windows, you'll have access to vim, which is my recommendation for anything 4 GB+. I've had to modify lines manually in a 6 GB+ MySQL file and it worked fine. You'll think it's not working, because it may take about a minute to read the file from disk into RAM, but it will get there. You'll want to be efficient about the number of times you save, because each save writes the full 6 GB back to disk, and navigation will also require search, line numbers, etc., but it works.

Aaron Chamberlain

I had an 8 GB SQL file, and every text editor or SQL tool I tried either crashed or refused to open it at all.

I was trying different programs on my computer, and PyCharm opened it with no effort at all, though it did crash once.

otuva

Sublime Text worked really well for me on a Mac. I tried many editors; PhpStorm worked great, but only in read-only mode. Sublime Text actually opens the full file for editing, though it takes a while to load. There's a progress bar while it's loading, though.

TheTC

Large MySQL dump files can be opened with the gVim application.

Cristik

010 Editor can handle files of any size. I just used it to edit a 3GB file that Chrome, TextPad and everything else crashed on.

Dante S

Why not simply import the file into SQL Server and view it in Enterprise Manager? You can import it directly, or use PowerShell, but the latter will probably run overnight (it is easier for small files, especially a lot of them). Once the data is in the database you can do whatever you want with ease, and the file size will no longer be an issue.

Skip

Using a large-file editor such as UltraEdit is an easy solution for editing; then run the script from cmd with the SQLCMD utility.

kroov

I got the same issue while opening a 3 GB SQL file, so I used IntelliJ IDEA to open the file, and it worked for me. I'm not suggesting that you download and install IntelliJ just to open a large file, but for the large community of developers who already use the IDE day to day, this will be the easiest solution.


You can use a command-line tool such as less or more to view the file. Open Command Prompt, PowerShell, or Git Bash, navigate to the directory where the .sql file is located, and then type:

more filename.sql

or

less filename.sql
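In the same spirit, if Git Bash (or any Unix shell) is available, `head` and `grep` can pull out just the parts you need without ever loading the whole file. The file name and table name below are placeholders:

```shell
# Demo input standing in for the real dump
seq 1 100 | sed 's/.*/INSERT INTO MyTable VALUES (&);/' > filename.sql

# Peek at the first 20 lines without loading the whole file
head -n 20 filename.sql

# Line numbers of the INSERT statements for one table
grep -n "INSERT INTO MyTable" filename.sql | head
```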
Sajal