
We have a MySQL 5.0 database on an Ubuntu server. There are 242 tables in that database, and a daily backup runs via the below command in crontab.

mysqldump -u root --password=<passwd> 'gss-app' table1 table2 .. tableN --skip-triggers --skip-add-drop-table --skip-lock-tables --compact --no-create-info --quick | bzip2 -c > /var/backups/gss-app.sql.bz2

Last week we had an issue where a customer deleted a few files, and when I tried to restore from the mysqldump I couldn't find the specific table in the dump file. The backup contains only 206 of the 242 tables. At first I didn't realize the dump was missing that particular table, and I posted a question on Stack Overflow.

So yesterday, I changed my crontab entry to:

mysqldump -u root --password=<passwd> gss-app | bzip2 -c > /var/backups/gss-app.sql.bz2

Once the backup completed, I extracted the backup file and ran the below command to see which tables were backed up.

grep -n 'Table structure' gss-app.sql

The dump file still contains only 206 of the 242 tables, and I had to run a separate mysqldump job to back up the remaining 36 tables.
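As a quick cross-check, the table names can be pulled out of the dump's "Table structure" headers and counted, then compared against what the server reports in information_schema.tables. A minimal sketch — the sample dump below is fabricated for illustration; a real run would point at the extracted gss-app.sql instead:

```shell
#!/bin/sh
# Fabricated two-table dump, standing in for the real gss-app.sql:
cat > /tmp/sample-dump.sql <<'EOF'
-- Table structure for table `table1`
CREATE TABLE `table1` (id INT);
-- Table structure for table `table2`
CREATE TABLE `table2` (id INT);
EOF

# List the table names that actually made it into the dump:
sed -n 's/^-- Table structure for table `\(.*\)`/\1/p' /tmp/sample-dump.sql

# Count them -- for gss-app this should print 242, not 206:
grep -c 'Table structure' /tmp/sample-dump.sql
```

The count can then be checked against `SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = 'gss-app';` on the server to spot missing tables immediately rather than during a restore.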

So, I would like to ask: is there any limit on the number of tables that can be backed up using the mysqldump command?
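One thing worth checking, separate from any table limit: in a pipeline like `mysqldump ... | bzip2 -c > file`, cron only sees bzip2's exit status, so if mysqldump fails partway through (one plausible cause of a truncated dump) the job still looks successful. With bash, `set -o pipefail` surfaces the failure; a minimal sketch using `false` as a stand-in for a failing mysqldump:

```shell
#!/bin/bash
# By default a pipeline's status is that of the LAST command (here cat,
# standing in for bzip2), so an upstream failure is silently swallowed:
false | cat > /dev/null
echo "default exit status: $?"

# With pipefail, the pipeline fails if any stage fails:
set -o pipefail
false | cat > /dev/null
echo "pipefail exit status: $?"
```

Running the cron command under `bash -o pipefail -c '...'` would at least make a mid-dump mysqldump failure visible in cron's mail/logs.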

Manny
  • Did you notice any patterns in the mysqldump output such as only the first 206 tables were dumped alphabetically? Do the remaining 36 tables have different engine (e.g. 206 tables were InnoDB and 36 were MyISAM)? Just trying to see if there was any pattern to the 36 tables that were left out. – zedfoxus May 12 '15 at 02:49
  • @BK435, the same command I mentioned above with the remaining 36 tables included after the database name. – Manny May 12 '15 at 04:08
  • @zfus, Yes the 206 tables were dumped alphabetically, and most of the tables in the database are using InnoDB engine. Only 26 tables are using MyISAM. – Manny May 12 '15 at 04:11
  • Yes, this is a production database. This syntax has been in use for a long time (since before I joined the company recently). But, as I mentioned in my question, the same thing happened when I ran the mysqldump command without any options: the dump doesn't contain all the tables. – Manny May 12 '15 at 04:52
  • I am stumped. As a workaround, which may become your backup tool of choice, I propose using the freely available XtraBackup from Percona and checking its performance vs. mysqldump. – zedfoxus May 12 '15 at 05:22
  • Are the tables being extracted sequentially and then one or two are skipped totaling 36? or is it the last 36, middle 36 or first 36? Since this is production are these tables that are often being used in queries? Can you do a dump with --no-data and no other options and see if all tables get dumped – BK435 May 13 '15 at 18:53
  • Yes, I have taken a dump using --no-data and found all 242 table structures are backed up, and I used it to restore one of the tables yesterday. I have added a new entry to the cron file to run a mysqldump command mentioning the remaining 36 tables. – Manny May 14 '15 at 01:48
  • The new cron job stores the remaining 36 tables in a separate file named `gss-app-remaining.sql.bz2`. I also added one more cron job to dump the database structure using --no-data. So, when we need to restore a table in the future, I'll need to restore the structure dump first and then use one of the data dumps depending on the table name. – Manny May 14 '15 at 01:55
  • @BK435, The last 36 tables were not taken in the dump. – Manny May 14 '15 at 02:30
  • Instead, you should first take a dump with no data to get all the tables, and then append the data in a second dump. That way it's all in one cron job and one SQL file, and you don't have to restore from two separate dumps – BK435 May 14 '15 at 03:34
  • Thanks BK435. Just to make sure I understand your comment, could you please provide a sample command for appending to the dump file? – Manny May 15 '15 at 01:05
  • mysqldump --skip-triggers --skip-lock-tables --no-data -u$youruser -p$yourpassword -h $yourhost > $dumpfile
    mysqldump --no-create-info --skip-triggers --skip-lock-tables $ignore_tables -u$youruser -p$yourpass -h $yourhost >> $dumpfile
    – BK435 May 18 '15 at 23:33
  • >> appends data to an already existing file... – BK435 May 18 '15 at 23:34
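BK435's two-pass approach can be sketched as a single script: pass 1 writes the schema for all tables with `>` (create/truncate), pass 2 appends the data with `>>`. The mysqldump lines are commented out because they need a live server and credentials ($DB_USER/$DB_PASS are placeholders, not values from this post); the echo lines stand in for them to show the truncate-then-append behavior:

```shell
#!/bin/sh
DUMPFILE=/tmp/gss-app-demo.sql

# Pass 1 -- schema only for every table; '>' creates/truncates the file:
# mysqldump --skip-triggers --skip-lock-tables --no-data \
#   -u"$DB_USER" -p"$DB_PASS" gss-app > "$DUMPFILE"
echo "-- schema dump (pass 1)" > "$DUMPFILE"

# Pass 2 -- data only (--no-create-info); '>>' appends to the same file:
# mysqldump --no-create-info --skip-triggers --skip-lock-tables \
#   -u"$DB_USER" -p"$DB_PASS" gss-app >> "$DUMPFILE"
echo "-- data dump (pass 2)" >> "$DUMPFILE"

cat "$DUMPFILE"   # both passes end up in the single file
```

A restore then needs only this one file, and every table structure is present even if a table's data was skipped.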

0 Answers