I have an issue on a server whereby, on occasion, automated backups from the server to a remote host fail. This leaves me with no recent backups and a pile of .tar.gz files taking up a large amount of space on the server.
My current process for correcting this when it happens is to PuTTY in manually and command-line FTP these files across individually. This is time-consuming and tedious.
I want to write a .sh script I can upload to the folder that tells the server to put across each .tar.gz file in it. I can't transfer the folder as a whole, only the individual files in it, since some files have already been transferred correctly.
I found this question, which shows a script that worked for its asker, but I need to adjust parts of it. I'm not confident enough with .sh scripting to do this myself, and I'm also wary of breaking anything server-side.
#!/bin/sh
USERNAME="user"
PASSWORD="pass"
SERVER="123.456.78.90"
DATE=$(date +%Y-%m-%d)
BACKUPDIR="/${DATE}/accounts/"

# Keep the full filename: stripping the .tar.gz suffix here would make
# the later `put` look for a file that does not exist.
find . -type f -name "*.tar.gz" -exec basename {} \; |
while read -r filename ; do
    /bin/ftp -inv "$SERVER" >> /tmp/ftp_backup.log <<EOF
user $USERNAME $PASSWORD
cd $BACKUPDIR
binary
put $filename
EOF
    echo "$DATE copied $filename" >> /tmp/ftp_backup.log
done
My intention is to upload this script into the server folder in question, chmod it, and run it there so that the .tar.gz files are FTP'd across, one at a time, to the backup directory (/<date>/accounts/), finishing once they have all been moved. (Then I would delete the server-side .tar.gz files and the .sh script above.)
There are ~60 files, up to 15 GB each in size. Filenames do not contain spaces.
Filepath structures:
Server side:
/backupsfolder/2018-07-11/filename1.tar.gz
/backupsfolder/2018-07-11/filename2.tar.gz
/backupsfolder/2018-07-11/backupscript.sh //my script above
/backupsfolder/2018-07-11/master.meta //other files
FTP side:
/2018-07-11/accounts/filename1.tar.gz
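For files this large, one option worth considering is lftp instead of the stock ftp client, since its `mput -c` resumes interrupted uploads rather than restarting a 15 GB transfer from zero. This is a hedged sketch, assuming lftp is actually installed on the server (check with `command -v lftp`; it often is not by default); credentials and server address are placeholders.

```shell
#!/bin/sh
# Hedged alternative using lftp: `mput -c` continues (resumes) partial
# uploads, useful for archives up to 15 GB. Placeholders throughout.

USERNAME="user"           # placeholder -- substitute real credentials
PASSWORD="pass"           # placeholder
SERVER="123.456.78.90"    # placeholder
DATE=$(date +%Y-%m-%d)

upload_backups() {
    lftp -u "$USERNAME,$PASSWORD" "$SERVER" <<EOF
cd /${DATE}/accounts
mput -c *.tar.gz
bye
EOF
}

# Only attempt the transfer once the placeholders have been replaced,
# so running this sketch unedited does nothing.
if [ "$SERVER" != "123.456.78.90" ]; then
    upload_backups
fi
```

Resuming only helps if the server's FTP daemon permits the REST and APPE operations, which is worth confirming before relying on it.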