
I have a 400 MB SQL backup file that I am trying to import into a MySQL database using WAMP's import, but the import was unsuccessful for several reasons, such as the upload file size being too large. So I have gone ahead and made a few changes to the PHP and MySQL configuration under WAMP. The changes made are:

Under WAMP->MySQL->my.ini

    max_allowed_packet = 800M
    read_buffer_size = 2014K

Under WAMP->PHP->PHP.ini

    max_input_time = 20000
    memory_limit = 128M

Is this the right way to import such a large SQL file using WAMPimport?

If so, did I make the right changes to the configuration files? What are the maximum values allowed for the above variables?

Anil

14 Answers


You can import large files this command line way:

    mysql -h yourhostname -u username -p databasename < yoursqlfile.sql
Ibu
  • @Ibu Thanks, but I have never used the MySQL command line on my laptop; I always use MySQL Workbench. Is there a way to launch the MySQL command line from Workbench? – Anil Mar 07 '13 at 18:10
  • Looks like a good excuse to start learning it. You don't need to be an expert to use it; everything you can do in Workbench works just fine on the command line. – Ibu Mar 07 '13 at 18:24
  • Thanks for your reply. I know I don't need to be an expert to use it. Earlier I used PuTTY to connect to the server and connected to the MySQL shell using the command `mysql -h hostname -u username -D databasename -p`. Now I am trying to connect to my own laptop but couldn't. Otherwise I am good. Thanks again. – Anil Mar 07 '13 at 18:48
  • For databases larger than 4 GB, this takes a long time. – Vipin Singh Feb 03 '18 at 04:09
  • This way may cause errors. The best way is to use `mysql> source` (see other answers). – German Khokhlov Dec 07 '18 at 11:21

Use this from mysql command window:

    mysql> use db_name;
    mysql> source backup-file.sql;
Wesam Na
  • I like this solution better because it shows the output of running the script. – Jenn Apr 21 '15 at 17:36
  • You can execute an SQL script file using the `source` command or the `\.` command: `mysql> source file_name` or `mysql> \. file_name` [Reference](https://dev.mysql.com/doc/refman/5.7/en/mysql-batch-commands.html) – Jorj Feb 13 '18 at 10:17
  • Will `source` perform well for a 50 GB file? – wwwwan Apr 21 '19 at 00:36
  • @wwwwan yes, I have just successfully sourced a 2.6 GB .sql file. It just takes a little while. You might need to increase the allowed packet size; see [this answer](https://stackoverflow.com/a/12792977/1685346). – glaux Aug 01 '19 at 07:46

Since you state (in a comment on another answer) that you are using MySQL Workbench, you could try using the "SQL script" option there. This avoids the need to use the command line (although I agree with the other answer that it is good to climb up on that horse and learn to ride).

  1. In MySQL Workbench, go to the File menu, then select "Open SQL Script". This will probably take a long time and might even crash the app, since it is not made to open files as big as the one you describe.

  2. The better way is to use the command line. Make sure you have the MySQL client installed on your machine. This means the actual MySQL client (not the Workbench GUI, phpMyAdmin, or anything like that). Here is a link describing the command-line tool. Once you have it downloaded and installed, open a terminal window, and you have two choices for slurping the data from your backup file into your target database: the 'source' command, or the < redirection operator.

Option 1: from the directory where your backup file is:

$ mysql -u username -p -h hostname databasename < backupfile.sql

Option 2: from the directory where your backup file is:

$ mysql -u username -p -h hostname
[enter your password]
mysql> use databasename;
mysql> source backupfile.sql

Obviously, both of these options require that you have a backup file that is SQL.
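
Before starting a long import, it can help to sanity-check that the file really is a complete SQL dump. The checks below assume a mysqldump-style file (the marker comments are mysqldump defaults; other export tools may differ, and `backupfile.sql` is just a placeholder name):

```shell
# Quick sanity checks on a mysqldump-style file before a long import.
head -n 1 backupfile.sql                 # typically starts with "-- MySQL dump ..."
tail -n 1 backupfile.sql                 # a complete dump usually ends with "-- Dump completed ..."
grep -c 'CREATE TABLE' backupfile.sql    # rough count of tables in the dump
```

If the last line is a truncated statement rather than the "Dump completed" comment, the backup was probably cut off, and the import will fail partway through.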

Megan Squire
  • The "source" method worked for me when the "regular" command-line method didn't. It also has the benefit of showing the execution activity in real time. – Jason Swett Oct 21 '14 at 16:33
  • Yes, the source method works for me too; at least it shows the progress of the query, but it takes a long time. Is there any method to reduce the upload time? – Vipin Singh Feb 03 '18 at 04:21

You can run the import from any SQL query browser using the following command:

    source C:\path\to\file\dump.sql

Run the above code in the query window ... and wait a while :)

This is more or less equal to the previously stated command line version. The main difference is that it is executed from within MySQL instead. For those using non-standard encodings, this is also safer than using the command line version because of less risk of encoding failures.

Magnus Lindgren
    Reference, for anyone who was curious: http://dev.mysql.com/doc/refman/5.0/en/mysql-batch-commands.html – Michael Sep 11 '13 at 14:07

I had a similar problem, but in Windows. I was trying to figure out how to import a large MySQL SQL file on Windows, and these are the steps I had to take:

  • Go to the download website (http://dev.mysql.com/downloads/).

  • Download the MySQL Community Server and install it (select the developer or full install, so it will install client and server tools).

  • Open the MySQL command-line client from the Start menu.

  • Enter the password you used during the install.

  • In the prompt, mysql>, enter:

      CREATE DATABASE database_name;
    
      USE database_name;
    
      SOURCE myfile.sql
    

That should import your large file.

live-love

I believe the easiest way is to upload the file using the MySQL command line.

Use this command from the terminal to access the MySQL command line and run source:

    mysql --host=hostname -uuser -ppassword
    source filename.sql

Or run the import directly from the terminal:

    mysql --host=hostname -uuser -ppassword < filename.sql

dave

We experienced the same issue when moving our SQL server in-house.

A good solution that we ended up using is splitting the SQL file into chunks. There are several ways to do that.

http://www.ozerov.de/bigdump/ seems good (but I have never used it).

I used http://www.rusiczki.net/2007/01/24/sql-dump-file-splitter/ and it was very useful for getting some structure out of the mess, and you can take it from there.

Hope this helps :)
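
If you prefer not to rely on a third-party splitter, a rough split can be scripted. This sketch assumes a dump produced by mysqldump with its default "-- Table structure" comment before each table (the file name `dump.sql` and the marker pattern are assumptions; adjust them for your dump, and note that this naive approach ignores views and stored routines):

```shell
# Start a new chunk file each time a "-- Table structure" marker appears;
# chunk_000.sql collects everything before the first table (header, SET statements).
awk '/^-- Table structure/ { n++ }
     { print > (sprintf("chunk_%03d.sql", n)) }' dump.sql
```

Each chunk can then be imported separately, e.g. `mysql -u username -p databasename < chunk_001.sql`.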

Ali Abdulla

Three things you have to do, if you are doing it locally:

In the php.ini or php.cfg of your PHP installation, set:

    post_max_size = 500M
    upload_max_filesize = 500M
    memory_limit = 900M

(or other values of your choice), then restart Apache.

Alternatively, use the PHP BigDump tool. It is the best I have ever seen. It is free and open source.

Shiv Singh

If you are using the `source` command on Windows, remember to use forward slashes: f:/myfolder/mysubfolder/file.sql, not f:\myfolder\mysubfolder\file.sql.

angus

The question is a few months old, but for other people looking:

A simpler way to import a large file is to make a subdirectory 'upload' in your folder c:/wamp/apps/phpmyadmin3.5.2 and edit this line in the config.inc.php file in the same directory to include the folder name: $cfg['UploadDir'] = 'upload';

Then place the incoming .sql file in the /upload folder.

Working from inside the phpMyAdmin console, go to the new database and import. You will now see an additional option to upload files from that folder. Choose the correct file and be a little patient. It works.

If you still get a timeout error, try adding $cfg['ExecTimeLimit'] = 0; to the same config.inc.php file.

I had difficulty importing an .sql file where the user name was root but the password differed from the root password on my new server. I simply removed the password before exporting the .sql file, and the import worked smoothly.

jobucks

In item 1.16 of the phpMyAdmin FAQ they say:

1.16 I cannot upload big dump files (memory, HTTP or timeout problems).


Starting with version 2.7.0, the import engine has been re-written and these problems should not occur. If possible, upgrade your phpMyAdmin to the latest version to take advantage of the new import features.

The first things to check (or ask your host provider to check) are the values of upload_max_filesize, memory_limit and post_max_size in the php.ini configuration file. All of these three settings limit the maximum size of data that can be submitted and handled by PHP. One user also said that post_max_size and memory_limit need to be larger than upload_max_filesize. There exist several workarounds if your upload is too big or your hosting provider is unwilling to change the settings:

Look at the $cfg['UploadDir'] feature. This allows one to upload a file to the server via scp, ftp, or your favorite file transfer method. PhpMyAdmin is then able to import the files from the temporary directory. More information is available in the Configuration of this document.

Using a utility (such as BigDump) to split the files before uploading. We cannot support this or any third party applications, but are aware of users having success with it.

If you have shell (command line) access, use MySQL to import the files directly. You can do this by issuing the “source” command from within MySQL:

source filename.sql;
Leo

The simplest solution is MySQL Workbench: just copy the .sql file's contents into the query window of MySQL Workbench and execute it. It will take care of the rest.

Usman Ali Maan

If you want to see the import progress, install pv with brew and run this:

    brew install pv
    pv -p dump.sql | mysql -h localhost -u root -p db_name
orazz

my.ini

    max_allowed_packet = 800M
    read_buffer_size = 2014K

PHP.ini

    max_input_time = 20000
    memory_limit = 128M
    post_max_size = 128M
  • This post is being automatically flagged as low quality because it is only code. Would you mind expanding it by adding some text to explain why this solves the problem? – gung - Reinstate Monica Jun 04 '14 at 13:57
  • Increasing the limits is no solution. Use the mysql command for large SQL imports. – ioleo Jun 26 '14 at 10:38