
I've previously asked a question about dumping a database by splitting it into separate tables. I managed to do that with the mysqldump command; however, when I try to restore the database locally to extract some data, the import stops with an error at a certain point, on a table that holds more than 2 GB of data. I even tried restoring each table individually to see whether that would succeed, but the result was the same: it's the same table that produces the error.

Can anyone tell me whether there's a way to repair a downloaded dump, or to verify it by checking each table?

Thanks in advance.

EDIT: This is the error I got: "ERROR 2006 (HY000) at line 5855099: MySQL server has gone away"
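For reference, one way to inspect the statement at the failing line of the dump (assuming the file is named dump.sql; such a line is often a very long multi-row INSERT, so the output is truncated here):

# print the first 200 characters of the line the import failed on
sed -n '5855099p' dump.sql | cut -c1-200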

EDIT 2: When I tried this on my VPS via SSH, it gave me this error: "ERROR 1153 (08S01) at line 4215: Got a packet bigger than 'max_allowed_packet' bytes"
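For reference, the server's current limits can be checked from the mysql prompt (the values are in bytes):

SHOW VARIABLES LIKE 'max_allowed_packet';
SHOW VARIABLES LIKE 'net_buffer_length';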

  • What is the error you are getting? – Damodaran Dec 01 '13 at 18:21
  • It was a timeout error, and when I use the mysql command to load the dump from the command prompt, it returns an error saying something is wrong around line 600000... – AzkerM Dec 01 '13 at 18:28
  • I feel you should try something like this: http://stackoverflow.com/questions/20264324/how-to-import-1gb-sql-file-to-wamp-phpmyadmin/20264467#20264467 – Ram Sharma Dec 02 '13 at 06:47
  • @RamSharma the link you posted mostly explains how to upload a big dump. My problem is that the import stops at a certain point, returning the errors I've mentioned above. – AzkerM Dec 02 '13 at 07:05
  • Your import stops at a certain point because mysql stops responding, and the most common reasons for that are too small a buffer length, allowed packet size, or memory, so it's advisable to increase them so the dump imports without any issues. – Ram Sharma Dec 02 '13 at 07:46
  • I guess I managed to do it by increasing the `max_allowed_packet` variable. BTW, thank you so much for your help! :) – AzkerM Dec 02 '13 at 07:53

1 Answer


Finally, after some searching and further checking of the errors, I found out that the max_allowed_packet variable was, as usual, the cause of this issue. Though I'm not an expert in this field, I'm happy I figured it out, and I'd like to share what helped me fix it and how I achieved it.

This is what I did:

  1. I first logged into the MySQL server by running mysql -u user -p and entering the password.
  2. Then I executed the following commands:

set global net_buffer_length=1000000;     -- raise the network buffer length to ~1 MB
set global max_allowed_packet=1000000000; -- allow packets of up to ~1 GB

  3. Finally, I left that terminal as it was, opened a new one, and ran the command below, which did the magic without any interruption:

mysql --max_allowed_packet=100M -u root -p database < dump.sql
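Note that --max_allowed_packet=100M here raises the limit on the client side, while the SET GLOBAL statements above raise it on the server side. Those SET GLOBAL changes only last until the server restarts; to make them permanent, something like the following could be added to the MySQL configuration file (a sketch; the file is commonly /etc/my.cnf or /etc/mysql/my.cnf, depending on the system), followed by a server restart:

[mysqld]
max_allowed_packet=1000M
net_buffer_length=1000000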

I hope this helps anyone else facing such issues.

Thanks.
