45

I have used this script for years on my VPS, and it's still working there:

# Build a space-separated list of every database except MySQL's own schemas
DBLIST=$(mysql -uroot -pROOT_PASSWORD -ANe "SELECT GROUP_CONCAT(schema_name) FROM information_schema.schemata WHERE schema_name NOT IN ('information_schema','performance_schema')" | sed 's/,/ /g')
MYSQLDUMP_OPTIONS="-uroot -pROOT_PASSWORD --single-transaction --routines --triggers"
BACKUP_DEST="/home/backup/db"

# Dump and compress each database in parallel, then wait for all of them
for DB in ${DBLIST}
do
    mysqldump ${MYSQLDUMP_OPTIONS} ${DB} | gzip > ${BACKUP_DEST}/${DB}.sql.gz &
done
wait

# Bundle the individual dumps into one dated archive
tar -czvf /home/backup/db2/$(date +%G-%m-%d)_db.tar.gz ${BACKUP_DEST}

Now I'm moving to another host. I'm trying to use the same script (of course I replaced ROOT_PASSWORD with the new credentials), but I don't know why I get this:

mysqldump: Got errno 32 on write
mysqldump: Got errno 32 on write
mysqldump: Got errno 32 on write
(the same line is repeated once for each database)
MultiformeIngegno

11 Answers

46
20:47:59 0 ~] $ perror 32
OS error code  32:  Broken pipe

So errno 32 is "broken pipe". You're piping the mysqldump output to gzip, so this means gzip terminated before mysqldump finished. This could happen, for example, because your disk is full, or because gzip exceeded a CPU time/usage limit your host has in place and was killed.
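
To narrow down which side of the pipe is failing, here is a minimal sketch that tests each stage separately. It reuses the variable names from the question's script; the concrete database name and paths are placeholders.

# Hypothetical values; substitute your own.
DB="some_database"
MYSQLDUMP_OPTIONS="-uroot -pROOT_PASSWORD --single-transaction --routines --triggers"
BACKUP_DEST="/home/backup/db"

# 1) Dump without gzip: if this succeeds, mysqldump and the target directory are fine.
mysqldump ${MYSQLDUMP_OPTIONS} "${DB}" > "${BACKUP_DEST}/${DB}.sql" || echo "dump or write to ${BACKUP_DEST} failed"

# 2) Compress an existing file: if this fails, the problem is on the gzip side
#    (disk space, CPU limits, missing directory, permissions ...).
gzip -c "${BACKUP_DEST}/${DB}.sql" > "${BACKUP_DEST}/${DB}.sql.gz" || echo "gzip failed"

# In bash, 'set -o pipefail' also makes the original pipeline report gzip's exit
# status instead of only mysqldump's "broken pipe" message.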

nos
  • Uhm.. the disk is not full and the CPU is free. :( – MultiformeIngegno Mar 10 '14 at 20:43
  • @MultiformeIngegno You are telling us you use a VPS; have you verified they don't have limits in place? It's quite common for a VPS to limit your CPU usage and kill processes that use too much. Anyway, you now have an indication that gzip is getting killed, so you can play around with that, e.g. not pipe the output to gzip but write directly to a file and see how that goes. – nos Mar 10 '14 at 20:49
  • @nos Resurrecting to note that indeed a CPU limit was my issue on AWS. I solved it by `nice`ing the dump (see the sketch after these comments). – msanford Nov 18 '15 at 16:13
  • If the disk is not full and the CPU is free, try this: create a basic text file and run the gzip command on it; you will soon realize what's wrong (that's how I figured it out when I had the same error message). – Yoric Oct 04 '17 at 08:07
  • This error in this scenario can definitely be a symptom of a full disk. – S3DEV Feb 24 '23 at 08:57
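
Following up on the CPU-limit discussion in the comments above, here is a minimal sketch of lowering the dump's CPU and I/O priority with nice and ionice. Whether your host actually enforces such limits is an assumption, and the variable names are taken from the question's script.

# Run the dump at reduced CPU (nice) and I/O (ionice idle class) priority so a
# constrained host is less likely to kill gzip mid-pipe.
nice -n 19 ionice -c3 mysqldump ${MYSQLDUMP_OPTIONS} ${DB} | nice -n 19 gzip > ${BACKUP_DEST}/${DB}.sql.gz
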
16

I had the same problem due to a couple of typos.

  1. I typed the name of the db user incorrectly. I had "db_user_1" when it was really "db_user1".

  2. After the pipe I forgot the > in gzip > myfile.tar.gz.

But I recommend you upgrade to MySQL 5.6+ as soon as possible, so you can stop exposing database passwords to other users.

Check out this answer on StackOverflow.
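
As a concrete illustration of that advice, here is a sketch using mysql_config_editor, which MySQL 5.6 introduced, so the password never appears on the command line or in ps output. The login-path name "backup" is an arbitrary choice.

# One-time setup: the credentials are stored encrypted in ~/.mylogin.cnf
mysql_config_editor set --login-path=backup --host=localhost --user=root --password

# The backup command then needs no password on the command line at all:
mysqldump --login-path=backup --single-transaction --routines --triggers mydb | gzip > /home/backup/db/mydb.sql.gz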

Buttle Butkus
11

Make sure the folder /home/backup/db/ (which you are using to store the backup) has write permission for the user running the script (for a quick check, try chmod -R 777 on that folder and re-run the script; tighten the permissions again afterwards).
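
For a less drastic check than chmod 777, a sketch along these lines can confirm and fix write access (the backupuser account name is a placeholder for whoever runs the script):

# Who owns the target directory, and what are its permissions?
ls -ld /home/backup/db/

# Confirm the current user can actually write there
touch /home/backup/db/.write_test && rm /home/backup/db/.write_test || echo "no write access"

# Give ownership to the backup account instead of opening the folder to everyone
sudo chown -R backupuser:backupuser /home/backup/db/
sudo chmod -R u+rwX /home/backup/db/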

  • +1 This was the problem in my case: the folder must be writable (mine had 0755 and needed 0777), so it depends on the rights of the MySQL/backup user. – vinsa Jun 19 '15 at 10:48
3

I was surprised I couldn't dump my DB even though I had been able to do it the day before; now I was getting this error.

As nos said, the error message means "broken pipe", which here means the output could not be written to disk. In my case, my SSH user had no permission to write to the folder I was targeting in my mysqldump command.

You can write your dump to your /home/your_user directory to see if you still get the same error. Doing so solved my problem.
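
A quick way to run that test (your_user and your_database are placeholders):

# Dump into your own home directory, where you certainly have write access.
# If this succeeds, the permissions on the original target directory are the problem.
mysqldump -u your_user -p your_database | gzip > "$HOME/your_database.sql.gz"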

Brac
3

I was using mysqldump from the CLI, trying to pipe to gzip and/or a file, and getting a "permission denied" error.

Even with sudo I was getting an error, because although mysqldump itself ran as root, the output redirection in the pipe was still performed by the shell account I was logged in as, and that account did not have permission to write to the target directory.

To get around this, you can use the tee command in conjunction with sudo:

mysqldump --single-transaction --routines --events --triggers --add-drop-table --extended-insert -u backup -h 127.0.0.1 -p --all-databases | gzip -9 | sudo tee /var/backups/sql/all_$(date +"%Y_week_%U").sql.gz > /dev/null

The | sudo tee /var/backups/... is what lets us write into a directory that is only writable by root. The > /dev/null keeps tee from also echoing the compressed output to the screen.

alexw
2

I faced the same problem. I don't know exactly why, but if you add the pv utility to the pipeline, everything works. Maybe it depends on your shell (bash/sh).

sudo apt-get install pv

PipeViewer (pv) is a very useful utility; it lets you watch data flowing through a pipe, for example while it is being written to disk.

Example:

mysqldump ${MYSQLDUMP_OPTIONS} ${DB} | gzip | pv > ${BACKUP_DEST}/${DB}.sql.gz
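
A variation on the same idea, with pv placed before gzip so the progress readout reflects the uncompressed dump rather than the compressed output (same variables as the question's script):

# pv between mysqldump and gzip reports how fast the raw dump is flowing;
# -N just labels the progress line with the database name.
mysqldump ${MYSQLDUMP_OPTIONS} ${DB} | pv -N "${DB}" | gzip > ${BACKUP_DEST}/${DB}.sql.gz
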
PRIHLOP
1

It's an old topic, but I faced the same problem and found the following:

My file name db_26/03.tar.gz raised the error above, but when I used db.tar.gz there was no error.

So you should check your file name.
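
As the comment below points out, the "/" in the name is treated as a directory separator. A small sketch of building a slash-free, date-based file name (the paths are assumptions):

# date formats such as %d/%m produce a "/" that the shell interprets as a
# directory separator; use "-" or "_" so the whole string is one file name.
OUTFILE="/home/backup/db_$(date +%d-%m).tar.gz"
tar -czf "${OUTFILE}" /home/backup/db/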

Muhammet Arslan
  • I'm +1-ing this because it worked for us, but I'd really love it if someone could explain WHY this fixes it. For us, the old filename works if the command is run through cron, but not from the command line. – Adam Yost Aug 17 '15 at 19:37
  • You have a `/` forward slash in your filename, so it is trying to use `03.tar.gz` inside the (non existent) directory `db_26` – 7ochem Feb 11 '19 at 15:40
1

Check whether the folder /home/backup/db/ exists at your location.

If not, create it together with any missing parent folders.

Command: mkdir -p /home/backup/db/

then run your command again.

  • This was also the problem for us. I was running Deployer on a new environment and it was not allowed to create folders. When I created the folder manually it went fine in subsequent runs. – Jacques Dec 05 '17 at 10:25
1

Errno 32 is "broken pipe", so any error on the receiving end of the pipe (in this case gzip and the redirection after it) will show up as errno 32. If the directory structure has changed and your ${BACKUP_DEST} no longer refers to a directory that exists, this problem would occur.

I would debug this by piping something else to your gzip command or creating an uncompressed backup not involving gzip.
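
A sketch of those checks, reusing the question's variables:

# Does the destination directory actually exist?
[ -d "${BACKUP_DEST}" ] || echo "${BACKUP_DEST} does not exist"

# Feed gzip something trivial: if this fails, the problem is on the gzip/destination side.
echo "pipe test" | gzip > "${BACKUP_DEST}/pipe_test.gz" || echo "gzip or destination failed"

# Skip gzip entirely: an uncompressed dump isolates mysqldump from the compression step.
mysqldump ${MYSQLDUMP_OPTIONS} ${DB} > "${BACKUP_DEST}/${DB}.sql" || echo "plain dump failed"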

reor
0

I was seeing this error when piping mysqldump output to s3cmd. It was caused by using the wrong version of s3cmd: on Ubuntu Trusty and Debian Wheezy the packaged s3cmd (version 1.1.0) does not support reading from stdin.
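
A quick sanity check before relying on the pipe (the bucket name is a placeholder; reading the object body from stdin with "-" is supported in newer s3cmd releases, not in 1.1.0):

# Verify the installed version first; the packaged 1.1.0 cannot read from stdin.
s3cmd --version

# With a recent s3cmd, "-" tells 'put' to take the object body from stdin.
mysqldump -u root -p --all-databases | gzip | s3cmd put - s3://my-backup-bucket/all-dbs.sql.gz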

Ryan
0

What helped me with this problem was running

export LANG=C

prior to running mysqldump, per https://github.com/netz98/n98-magerun/issues/771.
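
If you prefer not to change the locale for the whole shell session, the override can also be scoped to the single command; a sketch using the question's variables:

# Set the locale only for this pipeline instead of exporting it for the session.
LANG=C mysqldump ${MYSQLDUMP_OPTIONS} ${DB} | gzip > ${BACKUP_DEST}/${DB}.sql.gz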