
I am trying to automate the transfer of files from an SFTP server through Airflow, but I am running into a problem: I cannot save the files anywhere except the root folder.

Here is a snippet of my code:

import pysftp
from datetime import datetime

cnopts = pysftp.CnOpts()
cnopts.hostkeys = None
with pysftp.Connection(host=myHostname, username=myUsername, password=myPassword,
                       private_key=".ppk", cnopts=cnopts) as sftp:
    print("Connection successfully established ...")
    # Obtain structure of the remote directory '/var/www/vhosts'
    directory_structure = sftp.listdir_attr()
    # Download data
    for attr in directory_structure:
        if datetime.fromtimestamp(attr.st_mtime) >= datetime.strptime("2022-01-18", "%Y-%m-%d"):
            sftp.put(attr.filename, "/volume1/homes/[myuser]/Recordings/{a}".format(a=attr.filename))

When I run this DAG, I get the following error:

FileNotFoundError: [Errno 2] No such file or directory:

Any advice on how to fix this would be greatly appreciated.

1 Answer

Paramiko SFTPClient.put is for upload.

If you want to download, you have to use SFTPClient.get:

sftp.get(attr.filename,"/volume1/homes/[myuser]/Recordings/{a}".format(a=attr.filename))
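
Applied inside the question's existing with pysftp.Connection(...) as sftp: block, the download loop would look roughly like this (a minimal sketch that reuses the question's placeholder names such as [myuser]):

from datetime import datetime

cutoff = datetime.strptime("2022-01-18", "%Y-%m-%d")
for attr in sftp.listdir_attr():
    if datetime.fromtimestamp(attr.st_mtime) >= cutoff:
        # get() copies the remote file to the local path; put() does the opposite
        sftp.get(attr.filename, "/volume1/homes/[myuser]/Recordings/{a}".format(a=attr.filename))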

Obligatory warning: Do not set cnopts.hostkeys = None, unless you do not care about security. For the correct solution see Verify host key with pysftp.
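
A minimal sketch of that correct approach, assuming the server's host key has already been added to an OpenSSH-style known_hosts file (the path below is a placeholder):

import pysftp

# Point CnOpts at a known_hosts file containing the server's host key,
# instead of disabling host key checking with cnopts.hostkeys = None.
cnopts = pysftp.CnOpts(knownhosts="/path/to/known_hosts")
with pysftp.Connection(host=myHostname, username=myUsername,
                       password=myPassword, cnopts=cnopts) as sftp:
    ...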
