
I've got to make a cron job that transfers over 100 backup .tar.gz files to an ftp backup server. I'm stuck on combining the find command

 find /home/backup -mtime -1 -mmin +59 -type f -name "*.tar.gz*"

This part works fine. And the script part:

#!/bin/sh
USERNAME="user"
PASSWORD="password"
SERVER="someip"
FILE="/home/backup"
DATE="`date +%Y-%m-%d-%H.%M.%S `"
BACKUPDIR="/backup/${DATE}/"

ftp -inv $SERVER <<EOF
user $USERNAME $PASSWORD
mkdir $BACKUPDIR
cd $BACKUPDIR
mput $FILE/*.tar.gz*
quit
EOF

But this:

00 12 * * * find /home/backup -mtime -1 -mmin +59 -type f -name "*.tar.gz*" -exec /root/ftp.sh {} \;

doesn't work. No scp/ssh advice please, I have to do it with ftp.

Catbear
  • In the cronjob, you appear to use `find` to call the script once for each `tar.gz` file, yet inside the script you use `mput` and wildcards like you think the script is transferring lots of files... – Mark Setchell Apr 02 '18 at 11:46
  • Also, you don't set a PATH in your script, so you are kind of hoping that it will somehow know where to find `date`, and `ftp`. – Mark Setchell Apr 02 '18 at 11:47
  • Thanks for your comment, Mark. I shall try to change the script that way. No trouble with `date` or `ftp` noticed; probably it picks up the system PATH. – Catbear Apr 02 '18 at 11:56
  • I suggest to replace `mput $FILE/*.tar.gz*` with `mput $1`. – Cyrus Apr 02 '18 at 12:28
  • Output looks like this `/home/backup/20180402_122053_full.tar.gz.29 /home/backup/20180402_122053_full.tar.gz.94 /home/backup/20180402_122053_full.tar.gz.82 /home/backup/20180402_122053_full.tar.gz.3 /home/backup/20180402_122053_full.tar.gz.63` – Catbear Apr 02 '18 at 12:31
  • Also I tried to replace FILE variable with find output `FILE="$(find /home/backup -mtime -1 -mmin +59 -type f -name "*.tar.gz*")"` but now it transfers only 1 file – Catbear Apr 02 '18 at 12:34
  • Replacing `mput $FILE` with `mput $1` also didn't help, ftp response was `250 CWD successful. "/2018-04-02" is current directory. (local-files) local: quit remote: quit local: quit: No such file or directory` – Catbear Apr 02 '18 at 12:45
  • I guess something about paths is wrong. – Catbear Apr 02 '18 at 12:51
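Taken together, the comments point at a per-file approach: cron's `find -exec` calls the script once per matching file, the path arrives as `$1`, and the script puts just that one file. A sketch with the question's placeholder credentials (`remote_dir_for` and `upload_one` are hypothetical names, not from the post):

```shell
#!/bin/sh
# Sketch: find ... -exec /root/ftp.sh {} \; runs this once per file,
# so the full path of each matching file arrives as $1.
USERNAME="user"       # placeholders, as in the question
PASSWORD="password"
SERVER="someip"

# Hypothetical helper: one remote directory per day, e.g. /backup/2018-04-02/
remote_dir_for() {
    printf '/backup/%s/\n' "$(date +%Y-%m-%d)"
}

# Upload a single file, given as $1
upload_one() {
    dir="$(remote_dir_for)"
    ftp -inv "$SERVER" <<EOF
user $USERNAME $PASSWORD
mkdir $dir
cd $dir
put $1
quit
EOF
}

# upload_one "$1"    # uncomment when wiring into cron's find -exec
```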

2 Answers


I advise you to make the crontab command smaller. Not that it couldn't work your way, but it will be easier to understand what is happening.

00 12 * * * sh /root/ftpjob.sh

and

#!/bin/sh
PATH=/usr/bin:/bin        # as noted in the comments, cron sets almost no PATH
username="user"
password="password"
server="someip"
sourcedir="/home/backup"
date="$(date +%Y-%m-%d-%H.%M.%S)"
remotedir="/backup/${date}/"

find "$sourcedir" -mtime -1 -mmin +59 -type f -name "*.tar.gz*" |
 while read filename ; do
    # find prints the full local path; store it remotely under its basename
    /bin/ftp -inv "$server" >> /tmp/ftpjob.log <<EOF
user $username $password
mkdir $remotedir
cd $remotedir
put $filename ${filename##*/}
EOF
    echo "$date copied $filename" >> /tmp/ftpjob.log
done

This will work, as long as you are sure that your tar.gz filenames don't have spaces in them.
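If spaces in the names are a possibility, the pipe into `read` is the weak spot (it splits on whitespace). A space-safe variant in POSIX sh lets `find -exec` hand each path to a one-file command instead, so the name arrives intact in `$1`. A sketch, with `printf` standing in for the ftp here-document and `upload_today` as an illustrative name:

```shell
#!/bin/sh
# Space-safe sketch: find -exec passes each matching path as $1 to a
# small sh -c command, so filenames with spaces survive unsplit.
srcdir="${srcdir:-/home/backup}"   # the answer's source directory

upload_today() {
    # The printf is a stand-in for the /bin/ftp here-document above
    find "$srcdir" -mtime -1 -mmin +59 -type f -name "*.tar.gz*" \
        -exec sh -c 'printf "copied %s\n" "$1"' _ {} \;
}
```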

On the other hand, if you are able to do the ftp with an mput, there is no reason to do the find at all:

#!/bin/sh
PATH=/usr/bin:/bin
username="user"
password="password"
server="someip"
sourcedir="/home/backup"
date="$(date +%Y-%m-%d-%H.%M.%S)"
remotedir="/backup/${date}/"

# lcd first, so mput stores bare filenames instead of full local paths
/bin/ftp -inv "$server" >> /tmp/ftpjob.log <<EOF
user $username $password
lcd $sourcedir
mkdir $remotedir
cd $remotedir
mput *.tar.gz*
EOF

So, you would either use find to loop over the files, which is a good idea if there are multiple levels of directories where the tar.gz files are, or you would use mput in ftp if all the archives are always in the same directory.
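As an aside (not part of the answer): the `-n` in `ftp -inv` exists precisely to suppress auto-login from `~/.netrc`. If keeping the password out of the script is preferred, you could drop `-n` and the `user` line and instead give `~/.netrc` (permissions 600) an entry like this, using the question's placeholder values:

```
machine someip
login user
password password
```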

Ljm Dullaart
  • I appreciate your answer Ljm, your suggestions were very helpful. Unfortunately I cannot get rid of the `find` command, since the task is to move today's backups and ignore older ones. Sorry I didn't mention that before. – Catbear Apr 03 '18 at 08:47

Based on Ljm Dullaart's answer, the working script looks like this:

#!/bin/sh
USERNAME="user"
PASSWORD="password"
SERVER="someip"
DATE="$(date +%Y-%m-%d)"
BACKUPDIR="/${DATE}/"
cd /home/backup
# basename strips the directory, so put sends a bare filename
find . -mtime -1 -mmin +59 -type f -name "*.tar.gz*" -exec basename {} \; |
  while read filename ; do
    /bin/ftp -inv "$SERVER" >> /tmp/ftp.log <<EOF
user $USERNAME $PASSWORD
mkdir $BACKUPDIR
cd $BACKUPDIR
put $filename
EOF
    echo "$DATE copied $filename" >> /tmp/ftp.log
done

The same can be done using a different loop:

#!/bin/sh
USERNAME="user"
PASSWORD="password"
SERVER="someip"
SOURCEDIR="/home/backup"
DATE="$(date +%Y-%m-%d)"
BACKUPDIR="/${DATE}/"
cd "$SOURCEDIR"
for i in $(find "$SOURCEDIR" -mtime -1 -mmin +59 -type f -name "*.tar.gz*" -exec basename {} \;)
do
/bin/ftp -inv "$SERVER" >> /tmp/ftp.log <<EOF
user $USERNAME $PASSWORD
mkdir $BACKUPDIR
cd $BACKUPDIR
put $i
EOF
done

Both will upload the day's backup files via ftp into a date-named directory.

Catbear