There is this answer: How to use paramiko to transfer files between two remote servers?, but it does not really answer this question.
What I'm thinking of doing is initiating two connections from a third location (my laptop, for example) and transferring some files from one remote to the other without having to download them locally first.
So first, is this even possible? If so, does it save any time compared with downloading the files locally and then uploading them to the other remote? I don't want to put a script on the remote; I want to control the transfer from my computer.
I already have an implementation that takes files from one remote, downloads them locally, and then uploads them to the other remote. But I need to move lots of files, which can take time, so I'm wondering if there is a better approach.
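What I'm imagining is that, since paramiko's SFTPClient.open returns a file-like object, the transfer could just stream chunks between the two handles so nothing ever touches my local disk. A minimal sketch of that idea (stream_copy is a name I made up; the demo uses in-memory buffers in place of real SFTP handles):

```python
import io

def stream_copy(src_fobj, dst_fobj, chunk_size=32768):
    """Copy between two file-like objects chunk by chunk, so the
    whole file is never held in memory or written to local disk."""
    while True:
        chunk = src_fobj.read(chunk_size)
        if not chunk:
            break
        dst_fobj.write(chunk)

# With real connections this would be something like (untested):
#   with sftp_from.open(source_path, 'rb') as src, \
#        sftp_to.open(target_path, 'wb') as dst:
#       stream_copy(src, dst)

# Demo with in-memory buffers standing in for the SFTP handles:
src = io.BytesIO(b"payload bytes")
dst = io.BytesIO()
stream_copy(src, dst)
print(dst.getvalue())  # b'payload bytes'
```

Note that the data would still flow through my machine in both directions, so the win over download-then-upload would mostly be skipping the local disk round-trip, not the network hops.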
So I have this implementation (which does not work, because put() expects a local path):
import errno
import os
import stat


def rpath_exists(path, sftp):
    """Check if remote path exists.

    Checking is done using paramiko connection.
    """
    try:
        sftp.stat(path)
    except IOError as e:
        if e.errno == errno.ENOENT:
            return False
        raise
    else:
        return True


def _is_rfile(remote_file_obj):
    """Check if remote path is a file."""
    if stat.S_ISDIR(remote_file_obj.st_mode):
        return False
    return True


def transfer_dir(source, target, sftp_from, sftp_to, create_dir=True):
    """Transfer a directory between two remotes using their sftp clients."""
    if create_dir:
        _create_rdir(target, sftp_to)  # helper not shown here
    for item in sftp_from.listdir_attr(source):
        filename = item.filename
        source_path = os.path.join(source, filename)
        target_path = os.path.join(target, filename)
        if _is_rfile(item):
            # Obviously this won't work, because put() expects a local
            # path, but I'm passing a remote path from sftp_from.
            sftp_to.put(source_path, target_path)
        else:
            if not rpath_exists(target_path, sftp_to):
                sftp_to.mkdir(target_path)
            # We pass create_dir=False, because only the root dir
            # needs to be created.
            transfer_dir(
                source_path, target_path, sftp_from, sftp_to,
                create_dir=False)
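For the line marked as broken, what I'm now thinking of trying (a sketch, not yet tested against real servers) is to open the file on the source connection and hand the resulting file-like object to SFTPClient.putfo on the target connection, since putfo accepts a file object instead of a local path. transfer_rfile and FakeSFTP below are made-up names; FakeSFTP only mimics enough of the API to demonstrate the call shape without real servers:

```python
import io

def transfer_rfile(source_path, target_path, sftp_from, sftp_to):
    """Intended replacement for the failing sftp_to.put(...) call:
    stream the remote file directly between the two connections."""
    with sftp_from.open(source_path, 'rb') as src:
        # putfo() takes a file-like object, not a local path.
        sftp_to.putfo(src, target_path)

class FakeSFTP:
    """Tiny in-memory stand-in for paramiko.SFTPClient (hypothetical),
    implementing only what the demo below needs."""
    def __init__(self):
        self.files = {}  # path -> bytes
    def open(self, path, mode='rb'):
        return io.BytesIO(self.files[path])
    def putfo(self, fl, path):
        self.files[path] = fl.read()

sftp_a, sftp_b = FakeSFTP(), FakeSFTP()
sftp_a.files['/remote_a/report.txt'] = b'some file contents'
transfer_rfile('/remote_a/report.txt', '/remote_b/report.txt',
               sftp_a, sftp_b)
print(sftp_b.files['/remote_b/report.txt'])  # b'some file contents'
```

If that works, the fix in transfer_dir would be a one-line change at the marked spot, with everything else (the directory recursion) staying the same.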