I've been working on this for days, pretty frustrated.
I have a Magento database, about 1 GB with 3MM records - I need to make a backup and import it onto my local machine. The local machine is running WAMP on a brand-new gaming rig (specs: 16 GB RAM). Exported the db fine using phpMyAdmin into a .sql file.
Saw BigDump highly recommended for importing a large db. I also found a link saying it's recommended for the export syntax to include column names in every INSERT statement, which I did ( http://www.atomicsmash.co.uk/blog/import-large-sql-databases/ ).
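For anyone following along, mysqldump can produce that INSERT style directly with its --complete-insert flag. The snippet below just illustrates the two formats (the table and rows are made up, not from the real dump):

```shell
# Two INSERT styles a dump can contain (sample lines, hypothetical table):
cat > /tmp/insert_styles.sql <<'EOF'
INSERT INTO sales_order VALUES (1,'100000001');
INSERT INTO sales_order (entity_id, increment_id) VALUES (1,'100000001');
EOF
# mysqldump emits the second style (column names listed) when run with
# --complete-insert, e.g.: mysqldump --complete-insert -u user -p dbname > dump.sql
grep -c 'INSERT INTO sales_order (' /tmp/insert_styles.sql   # prints 1
```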
Started importing. Hours go by (around 3-4), then an error: Page unavailable, or wrong url!
More searching; I tried the suggestions ( mostly from here: http://www.sitehostingtalk.com/f16/bigdump-error-page-unavailable-wrong-url-56939/ ) to drop $linespersession to 500 and add a $delaypersession of 300. Ran again, more hours, same error.
I then re-exported the db into two .sql dumps (one holding all the large tables with over 100K records), repeated the process, and got the same error. So I quit using BigDump.
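For what it's worth, splitting by table can also be scripted instead of re-exported. A minimal sketch, assuming the dump uses mysqldump's usual `-- Table structure for table` markers (a tiny made-up dump stands in for the real 1 GB file):

```shell
# Split a dump into one file per table at mysqldump's section markers.
# /tmp/mini_dump.sql stands in for the real dump.
cat > /tmp/mini_dump.sql <<'EOF'
-- Table structure for table `small_table`
CREATE TABLE `small_table` (id INT);
-- Table structure for table `big_table`
CREATE TABLE `big_table` (id INT);
EOF
awk '/^-- Table structure for table/ { n++ }
     { print > ("/tmp/part_" n ".sql") }' /tmp/mini_dump.sql
# Caveat: the SET header at the top of a real dump lands only in the first
# part, so later parts may need it prepended before importing.
ls /tmp/part_*.sql
```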
Next up was the command line! Using Console2 I ran source mydump.sql. 30 hours went by. Then an error:
ERROR 1231 (42000): Variable 'character_set_client' can't be set to the value of 'NULL'
ERROR 1231 (42000): Variable 'collation_connection' can't be set to the value of 'NULL'
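One plausible cause I can describe for exactly these two errors: a mysqldump file saves the current session variables into @OLD_* user variables at the top and restores from them at the bottom, so a file that contains the restore lines without the matching save lines (e.g. a split or truncated dump) tries to set the variables to NULL. A sketch of the relevant lines (the trimmed-down header and file paths are illustrative):

```shell
# Top of a typical mysqldump file: save session vars into @OLD_* variables.
cat > /tmp/dump_top.sql <<'EOF'
/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;
/*!40101 SET NAMES utf8 */;
EOF
# Bottom of the file: restore from those @OLD_* variables.
cat > /tmp/dump_bottom.sql <<'EOF'
/*!40101 SET CHARACTER_SET_CLIENT=@OLD_CHARACTER_SET_CLIENT */;
/*!40101 SET COLLATION_CONNECTION=@OLD_COLLATION_CONNECTION */;
EOF
# If the bottom runs in a session where the top never ran, @OLD_* is NULL,
# which produces: ERROR 1231 ... can't be set to the value of 'NULL'
grep -c '@OLD_' /tmp/dump_bottom.sql   # prints 2
```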
More searching turned up really varied explanations. I tried with the split files from before - ran it again, same error.
I can't figure out what would cause both of these errors. I know I got the same errors on two different exports. I know there are a few tables with between 1 and 300,000 rows. I also don't think 30 hours is normal (on a screaming fast machine) for an import of only 1 GB, but I could be wrong.
What other options should I try? Is it the format of the export? Should it be compressed or not? Is there a faster way of importing? Any way of making this go faster?
Thanks!
EDIT
Thanks to some searching and @Bill Karwin's suggestion, here's where I'm at:
- Grabbed a new mysqldump using ssh and downloaded it.
- Imported the database 10 different times. Each time was MUCH faster (5-10 mins) so that fixed the ridiculous import time.
- Used the command line: >source dump.sql
- However, each import from that same dump.sql file produces a different number of records. Of the 3 million records, the imports differ by between 600 and 200,000 records, and one import has 12,000 MORE records than the original. I've tried with and without setting foreign_key_checks = 0;
I tried running the same query multiple times with exactly the same settings. Every time the number of rows are different.
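One thing worth checking before trusting those differing counts: phpMyAdmin and SHOW TABLE STATUS report estimated row counts for InnoDB tables (hence the last reference below), so the real comparison is SELECT COUNT(*) per table versus the rows actually present in the dump file. A rough sketch of counting rows in the dump itself (a tiny made-up file stands in for dump.sql; the count is approximate if any values contain the text "),("):

```shell
# Count rows in a dump: extended INSERTs pack many rows per statement,
# so count "),(" row separators plus one for each INSERT line.
cat > /tmp/sample_dump.sql <<'EOF'
INSERT INTO t VALUES (1,'a'),(2,'b'),(3,'c');
INSERT INTO t VALUES (4,'d');
EOF
awk '/^INSERT INTO/ { rows += gsub(/\),\(/, "") + 1 }
     END { print rows }' /tmp/sample_dump.sql   # prints 4
```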
I'm also getting these errors now:
ERROR 1231 (42000): Variable 'time_zone' can't be set to the value of 'NULL'
ERROR 1231 (42000): Variable 'sql_mode' can't be set to the value of 'NULL'
ERROR 1231 (42000): Variable 'foreign_key_checks' can't be set to the value of 'NULL'
ERROR 1231 (42000): Variable 'unique_checks' can't be set to the value of 'NULL'
ERROR 1231 (42000): Variable 'character_set_client' can't be set to the value of 'NULL'
ERROR 1231 (42000): Variable 'collation_connection' can't be set to the value of 'NULL'
ERROR 1231 (42000): Variable 'sql_notes' can't be set to the value of 'NULL'
These don't seem that important from what I've read. There are other warnings too, but I can't determine what they are.
Any ideas?
EDIT: Solution removed here and listed below as a separate post
References:
https://serverfault.com/questions/244725/how-to-is-mysqls-net-buffer-length-config-viewed-and-reset
http://dev.mysql.com/doc/refman/5.1/en/server-system-variables.html#sysvar_net_buffer_length
Make phpMyAdmin show exact number of records for InnoDB tables?