
I have an issue on a server whereby, on occasion, automated backups from the server to a remote host fail.

Currently this leaves me with no recent backups and with a pile of .tar.gz files taking up a large amount of space on the server.

My current process for correcting this when it happens is to manually PuTTY in and FTP these files across individually from the command line. This is time-consuming and tedious.

I want to write a .sh script I can upload to the folder that tells the server to put across each .tar.gz file in that folder. I can't transfer the folder as a whole, only each file in it, as some files have already been transferred correctly.

I found this question, which shows a script that worked for its asker, but I need to adjust parts of it and I am not confident enough with .sh instructions to do so; I am also wary of breaking anything server side.

#!/bin/sh
USERNAME="user"
PASSWORD="pass"
SERVER="123.456.78.90"
DATE="`date +%Y-%m-%d`"
BACKUPDIR="/${DATE}/accounts/"

find . -type f -name "*.tar.gz" -exec basename {} .tar.gz \; |
  while read filename ; do
    /bin/ftp -inv $SERVER >> /tmp/ftp_backup.log <<EOF
user $USERNAME $PASSWORD
cd $BACKUPDIR
binary
put $filename
EOF
    echo "$date copied $filename" >> /tmp/ftp_backup.log
done

My intention is to make a script that I can upload into the server folder in question and run (after chmod-ing it) so that the .tar.gz files are FTP'd across, one at a time, to the backup directory (/<date>/accounts/), finishing once they have all been moved.

(Then I would delete the server-side .tar.gz files and the .sh script above.)

There are ~60 files, up to 15 GB in size. Filenames do not contain spaces.

Filepath structures:

Server side:

/backupsfolder/2018-07-11/filename1.tar.gz
/backupsfolder/2018-07-11/filename2.tar.gz
/backupsfolder/2018-07-11/backupscript.sh //my script above
/backupsfolder/2018-07-11/master.meta //other files

FTP side:

/2018-07-11/accounts/filename1.tar.gz

What do I need to adjust on the above script to do this?


1 Answer


After some work I found a few issues to be careful of and fix:

1) In order to run, .sh files need to be made executable with chmod on the server:

chmod +x ./<filename>

2) Unix line endings: Notepad++ claimed to have saved the file with correct (Unix) line endings, but the server was raising the error:

/bin/sh^M: bad interpreter: No such file or directory

this was solved with:

sed -i 's/\r//'  <filepath>/<filename>

from this answer.
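If you want to confirm the line endings really were fixed, here is a quick sketch using a hypothetical demo file in /tmp (printf writes literal CRLF endings, reproducing what Notepad++ had saved):

```shell
# Hypothetical demo file with Windows (CRLF) line endings.
printf 'line1\r\nline2\r\n' > /tmp/crlf_demo.sh

# The same sed fix as above, applied to the demo file.
sed -i 's/\r//' /tmp/crlf_demo.sh

# Count the carriage-return bytes left in the file (prints 0 once fixed).
tr -dc '\r' < /tmp/crlf_demo.sh | wc -c
```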

3) The names of the files being pushed to FTP were wrong - they did not include the .tar.gz suffix - I hadn't realised the -exec basename call was cutting off the .tar.gz.

This was fixed with

-exec basename {} .tar.gz \;

becomes

-exec basename {} \;
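The difference is easy to see in isolation: when basename is given a second argument, it treats it as a suffix to strip from the result.

```shell
# With the suffix argument, basename strips ".tar.gz" from the name:
basename /backupsfolder/2018-07-11/filename1.tar.gz .tar.gz   # prints "filename1"

# Without it, the full filename survives, which is what "put" needs:
basename /backupsfolder/2018-07-11/filename1.tar.gz           # prints "filename1.tar.gz"
```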

4) Log file output was not being written on new lines; it was all ending up on the same line. This was fixed after reading this answer: use -e on the echo statements together with the \n escape.

echo -e "$date copied $filename\n" 
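One hedge worth noting: echo -e is a bash feature, while the script's shebang is /bin/sh; on systems where /bin/sh is dash, echo -e prints a literal "-e". printf is specified by POSIX and behaves the same everywhere, so a portable equivalent of the line above would be (the values below are samples standing in for the script's variables):

```shell
# Sample values standing in for the script's variables.
DATE="2018-07-11"
filename="filename1.tar.gz"

# printf is POSIX-specified, so this works under any /bin/sh,
# producing the log line followed by a blank line (the \n\n).
printf '%s copied %s\n\n' "$DATE" "$filename"
```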

Final fully working bash script for my needs:

  • 1) Save the script to the server
  • 2) Run sed -i 's/\r//' /<filepath>/<filename>
  • 3) Run chmod +x ./<filename>
  • 4) Run the file in bash.
  • 5) View results in the tmp directory specified.

The script

This script takes the .tar.gz files in the current directory and uploads them to the remote FTP server, cycling through each file in turn.

#!/bin/sh
USERNAME="user"
PASSWORD="pass"
SERVER="123.456.78.90"
DATE="$(date +%Y-%m-%d)"
BACKUPDIR="/${DATE}/accounts/"

# Upload each .tar.gz in the current directory in its own FTP session,
# appending the FTP client's output to the log.
find . -type f -name "*.tar.gz" -exec basename {} \; |
  while read filename ; do
    ftp -inv "$SERVER" >> /tmp/My_ftp_backup.log <<EOF
user $USERNAME $PASSWORD
cd $BACKUPDIR
binary
put $filename
EOF
    echo -e "$DATE copied $filename\n" >> /tmp/My_ftp_backup.log
done
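One caveat, as a hedge: find without -maxdepth also descends into subdirectories, and put would then look for those basenames in the current directory and fail. If the backup folder ever gains subfolders, adding -maxdepth 1 keeps the listing to the folder itself. A sketch in a throwaway temporary directory (the file names here are hypothetical):

```shell
# Build a throwaway directory tree with one file in a subfolder.
demo=$(mktemp -d)
mkdir "$demo/old"
touch "$demo/filename1.tar.gz" "$demo/old/filename2.tar.gz"

cd "$demo"
# -maxdepth 1 restricts the search to this folder only, so every name
# returned is one that "put" can actually open from here.
find . -maxdepth 1 -type f -name "*.tar.gz" -exec basename {} \;
```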