33

I'm copying from one NAS to another (Netgear ReadyNAS -> QNAP). I tried pulling the files by running rsync on the QNAP, but that took forever, so I'm currently trying to push them from the Netgear. The command I'm using is:

rsync -avhr /sauce/folder admin@xxx.xxx.xxx.xxx:/dest/folder

I'm seeing:

sending incremental file list

and nothing after that.

The transfer is 577 GB and there are a lot of files, but I'm seeing almost no network traffic on the QNAP (it fluctuates between 0 KB/s and 6 KB/s), so it looks like it's not actually sending any kind of incremental file list.

All folders are created on the destination, and then nothing happens after that.

Anyone have any thoughts? Or any ideas on whether there is a better way to copy files from a ReadyNAS to a QNAP?

Dale King

5 Answers

27

The documentation for -v says it increases verbosity. If the only thing you're interested in is seeing more progress, you can chain -v together like so:

rsync -avvvhr /sauce/folder/ admin@xxx.xxx.xxx.xxx:/dest/folder/

and you should see more detailed progress output.

This can also tell you whether your copying requirements (-a) are stricter than you need and are therefore costing a lot of unnecessary processing time. For example, I attempted to use -a, which is equivalent to -rlptgoD, on over 100,000 images. Sending the incremental file list did not finish, even overnight. After changing it to

rsync -rtvvv /sauce/folder/ admin@xxx.xxx.xxx.xxx:/dest/folder/

sending the incremental file list became much faster, and file transfers were visible within 15 minutes.

I'll Eat My Hat
10

After leaving it overnight and seeing it do nothing, I came in and tried again.

The command that worked appended a '*' to the end of the source folder, so this is what worked:

rsync -avhr /sauce/folder/* admin@xxx.xxx.xxx.xxx:/dest/folder

If anyone else has trouble, give this a shot.

Benjamin Loison
Dale King
  • OMG I was having the same issue. I couldn't work out why it was just sitting there doing nothing. Sure enough, changed it from /mnt/websites to /mnt/websites/* and it works. ARGH! Thanks for sharing your solution :) – Andrew Newby Oct 17 '20 at 06:03
  • On the money for me too! So I guess /sauce/folder first checks whether the (presumably large) folder as a whole needs updating, while /sauce/folder/* skips straight to the files within that folder. – mugen Feb 23 '21 at 15:27
  • I really want to know how this works. Does it just skip hidden directories? – ki9 Aug 18 '21 at 23:07
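A note on the comments above: the '*' is expanded by the shell, not by rsync, and a default shell glob does not match names beginning with a dot, so /sauce/folder/* will silently skip hidden files and directories at the top level (which likely answers ki9's question). A quick local demonstration, using a throwaway temp directory:

```shell
# The '*' is expanded by the shell before rsync ever runs,
# and a default glob does NOT match names starting with '.'.
d=$(mktemp -d)
touch "$d/visible.txt" "$d/.hidden.txt"

echo "$d"/*    # prints only .../visible.txt -- .hidden.txt is skipped
```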
2

My encounter with this was a large file that was incomplete but considered a "finished transfer". I deleted the large (incomplete) file on the remote side and ran another sync, which appears to have resolved the issue.

SrJoven
1

I am using QNAP 1 as the production system and QNAP 2 as a backup server. On QNAP 1, I use the following script as a cron job to copy files at regular intervals to the backup QNAP. Maybe you could try this:

#!/bin/sh
DATUM=`date '+%Y-%m-%d'`;
MAILFILE="/tmp/rsync_svn.txt"
EMAIL="my.mail@mail.com"

# Build the notification mail headers
echo "Subject: SVN Sync" > $MAILFILE
echo "From: $EMAIL" >> $MAILFILE
echo "To: $EMAIL" >> $MAILFILE
echo "" >> $MAILFILE
echo "-----------------------------------" >> $MAILFILE

# Sync over ssh, appending rsync's output (and any errors) to the mail body
rsync -e ssh -av /share/MD0_DATA/subversion 192.168.2.201:/share/HDA_DATA/subversion_backup >> $MAILFILE 2>&1

echo "-----------------------------------" >> $MAILFILE
cat $MAILFILE | sendmail -t
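To run a script like this at regular intervals, a crontab entry along these lines would do it (the schedule and script path here are examples, not taken from the answer):

```
# Example: run the backup script every night at 02:30
30 2 * * * /share/scripts/rsync_svn.sh
```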
Ben
0

I encountered the same thing and determined that it was because rsync was calculating checksums for comparison, which is very slow. By default rsync only compares file size and modification time to decide whether two files are identical; full checksums are computed only if you pass -c / --checksum.

To avoid this, make sure you are not passing -c / --checksum (there is no separate flag to disable it, since it is off by default).

Checksumming becomes a problem with large files or large numbers of files, so it may look like an issue with the file list, but most often it is not.

Tronathan