
I was importing a large MySQL database using

mysql -uroot -ppassword dbName

The database has gone away during the process, possibly due to a timeout, after a few days.
Is there a way to resume it? Or am I out of luck and need to delete the existing DB and re-import?

user121196
  • I would love to know the answer to this too. I know it is possible to resume an import when using phpMyAdmin, but I'm not sure if resume will work when using the MySQL command line tool. – Richard May 10 '12 at 16:18

4 Answers


It might help to use the --ignore option on the command line to "resume" the import.

The semantics are that it should ignore any already-imported data and only import what's not yet there.

Here's the MySQL documentation for the ignore option: http://dev.mysql.com/doc/refman/5.0/en/mysqlimport.html#option_mysqlimport_ignore
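A hedged sketch of what that might look like (the database and file names here are placeholders; note that mysqlimport loads one data file per table, named after the table, rather than a full SQL dump):

```
# --ignore skips input rows that duplicate an existing row on a unique key,
# so rows loaded before the crash are silently passed over on the rerun.
# dbName and /path/to/users.txt are assumed example names.
mysqlimport --ignore --local -uroot -p dbName /path/to/users.txt
```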

Nomad Siv

I'm using INSERT IGNORE INTO by stream editing (sed) my dump file like this:

nice gunzip < dumpfile.sql.gz | sed -e "s|^INSERT INTO |INSERT IGNORE INTO |g" | nice mysql -uroot -p"password" DBName
Umbrella

If you know the last insertion point in your dump, split the mysqldump file just before that point and replace INSERT with INSERT IGNORE. You probably don't want to INSERT IGNORE the whole dataset, as each row insert is still attempted.
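A minimal sketch of that approach, shown on a tiny stand-in dump (in practice dump.sql is your real dump and the tail offset is a line just before the failure point; both are assumptions here):

```shell
# Create a two-statement stand-in dump for demonstration purposes.
printf 'INSERT INTO t VALUES (1);\nINSERT INTO t VALUES (2);\n' > dump.sql
# Resume from line 2 onward, converting INSERT to INSERT IGNORE so any
# replayed rows near the cut are skipped on duplicate keys instead of
# aborting the import. Cut at a statement boundary, not mid-INSERT.
tail -n +2 dump.sql | sed 's/^INSERT INTO /INSERT IGNORE INTO /' > resume.sql
cat resume.sql   # prints: INSERT IGNORE INTO t VALUES (2);
# then: mysql -uroot -p dbName < resume.sql
```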

Also, "MySQL server has gone away" can be indicative of exceeding the max_allowed_packet size.
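If that is the cause, raising the limit before re-running the import may help. A sketch of the server-side setting (1G is an arbitrary example value; the mysql client accepts a matching --max_allowed_packet option):

```ini
# my.cnf — example value only; size it to your largest single statement
[mysqld]
max_allowed_packet=1G
```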

GWPROD

The "database has gone away" is usually indicative of the server crashing, check your mysql logs /var/log/mysqld.log or if not there run;

SELECT * FROM GLOBAL_VARIABLES WHERE VARIABLE_NAME = 'LOG_ERROR';

I've never had a client disconnect, even in week-long runs over the network. It looks like you're connecting locally, so a disconnect is very unlikely.

If you want to resume, you can do the following:

  • Check the error log to see the cause of the error, and fix that first
  • Grep the dump file for table boundaries: grep -in 'DROP TABLE' database.sql
  • Compare the tables already restored against the grep results; note the line number of the last match
  • Create a new file starting from the last matched table (inclusive): tail --lines=+10000 database.sql > resume.sql
  • Or, as someone else stated, use the --ignore-lines option in mysqlimport
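The steps above can be sketched end-to-end on a stand-in dump (the table names and the tail offset are assumptions; substitute your real dump file and the line number of the last restored table from your own grep results):

```shell
# Build a tiny stand-in dump with two table sections for demonstration.
printf 'DROP TABLE IF EXISTS a;\nINSERT INTO a VALUES (1);\nDROP TABLE IF EXISTS b;\nINSERT INTO b VALUES (2);\n' > database.sql
# Find where each table's section starts (-n prints line numbers).
grep -n 'DROP TABLE' database.sql        # last restored table starts at line 3
# Keep everything from that table onward (inclusive) in a resume file.
tail --lines=+3 database.sql > resume.sql
# then: mysql -uroot -p dbName < resume.sql
```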

Hope this helps

harvey
  • No, it's the connection to the database timing out; it'll be some keep-alive or timeout setting somewhere. – Owl Aug 03 '16 at 10:00