
Trying to put together my first useful Python program, with the aim of automating my website backups. I watched a tutorial on how to download a single file, but when it comes to a folder I'm less clear. I'd like to create a local backup of an entire folder from my website via FTP.

So far I have come up with this, with some help from this question:

from ftplib import FTP 
import os 

ftp=FTP("ftp.xxxxxxxxx.com") 
ftp.login("xxxxxxxxxxx","xxxxxxxxxx") #login to FTP account
print "Successfully logged in"
ftp.cwd("public_html") #change working directory to \public_html\
filenames = ftp.nlst() #create variable to store contents of \public_html\

os.makedirs("C:\\Users\\xxxxxx\\Desktop\\Backup")#create local backup directory
os.chdir("C:\\Users\\xxxxxx\\Desktop\\Backup")#change working directory to local backup directory

#for loop to download each file individually
for a in filenames:
    ftp.retrbinary("RETR " + a, file.write)
    file.close()


ftp.close()  #CLOSE THE FTP CONNECTION
print "FTP connection closed. Goodbye"

I'm reluctant to run it as I don't want to create a problem on my website if it's wrong. I should note that the filename & extension of each local file should exactly match those of the remote file being downloaded.

Any guidance appreciated!

arumiat

1 Answer


You don't need to change your working directory; just save each file under your intended path.

To download the files, you first need to get the list of file names:

file_list = []
ftp.retrlines('LIST', lambda x: file_list.append(x.split()))
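For reference, each line that `LIST` feeds to the callback is a Unix-style directory listing (the exact format varies by FTP server). Splitting on whitespace puts the permission string first and the file name last, assuming the name contains no spaces — the sample line below is an illustration, not output from your server:

```python
# One sample LIST line (format varies by server; shown here as an assumption)
line = "-rw-r--r--   1 user group  1234 Jan 01 12:00 index.html"
parts = line.split()
print(parts[0])   # permission string, e.g. "-rw-r--r--" ('d' first means directory)
print(parts[-1])  # file name, e.g. "index.html"
```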

Then separate the files from the directories, and download the files:

for info in file_list:
    ls_type, name = info[0], info[-1]
    if not ls_type.startswith('d'):  # directory entries start with 'd'
        with open(name, 'wb') as f:
            ftp.retrbinary('RETR {}'.format(name), f.write)
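Putting the pieces together as one helper — a minimal sketch, assuming Unix-style `LIST` output and file names without spaces; `download_all` and `backup_dir` are names invented here for illustration:

```python
import os

def download_all(ftp, backup_dir):
    """Download every regular file in the server's current directory
    into backup_dir, keeping each remote file name unchanged.
    `ftp` is an ftplib.FTP connection that is already logged in."""
    if not os.path.isdir(backup_dir):
        os.makedirs(backup_dir)  # create the local backup folder if needed

    # Collect one whitespace-split LIST entry per remote item
    file_list = []
    ftp.retrlines('LIST', lambda line: file_list.append(line.split()))

    for info in file_list:
        ls_type, name = info[0], info[-1]
        if not ls_type.startswith('d'):  # skip directories
            # Build the local path explicitly instead of calling os.chdir
            local_path = os.path.join(backup_dir, name)
            with open(local_path, 'wb') as f:
                ftp.retrbinary('RETR {}'.format(name), f.write)
```

Note this only grabs the top-level files; backing up subdirectories as well would need an extra `cwd` plus recursion step.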
Mazdak