Trying to put together my first useful Python program, with the aim of automating my website backups. I watched a tutorial on how to download a single file, but when it comes to a folder I'm less clear. I'd like to create a local backup of an entire folder from my website via FTP.
So far I have come up with this, with some help from this question:
from ftplib import FTP
import os
ftp = FTP("ftp.xxxxxxxxx.com")
ftp.login("xxxxxxxxxxx", "xxxxxxxxxx") # log in to the FTP account
print "Successfully logged in"
ftp.cwd("public_html") # change remote working directory to /public_html/
filenames = ftp.nlst() # list the file names in /public_html/
os.makedirs("C:\\Users\\xxxxxx\\Desktop\\Backup") # create local backup directory (raises OSError if it already exists)
os.chdir("C:\\Users\\xxxxxx\\Desktop\\Backup") # change local working directory to the backup directory
# loop to download each file individually
for filename in filenames:
    local_file = open(filename, "wb") # local name and extension match the remote file exactly
    ftp.retrbinary("RETR " + filename, local_file.write)
    local_file.close()
ftp.close() # close the FTP connection
print "FTP connection closed. Goodbye"
I'm reluctant to run it as I don't want to cause a problem on my website if it's wrong. I should note that the filename and extension of each local file must exactly match those of the remote file being downloaded.
Any guidance appreciated!