I have a list of AWS Ubuntu servers, e.g.
ubuntu@ec2-bla-95-blablabla-23.amazonaws.com
ubuntu@ec2-bla-95-blablabla-43.amazonaws.com
ubuntu@ec2-bla-95-blablabla-24.amazonaws.com
...
On each of these servers, I have a folder with a variable number of files; the path is the same on every server, e.g. /roth/files/.
I want to write a Python script that fetches the contents of those files and merges them locally on my machine.
How do I go about fetching the contents of those files from the remote servers?
The way I log in to these servers is
ssh -i path/aws.pem ubuntu@ec2-bla-95-blablabla-23.amazonaws.com
i.e. using a key.
I found an answer to a similar question here:
sftp_client = ssh_client.open_sftp()
remote_file = sftp_client.open('remote_filename')
try:
    for line in remote_file:
        pass  # process line
finally:
    remote_file.close()
But I do not see where you provide the server name and the key.
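From the paramiko docs, I gather the server name and key are supplied when the underlying SSHClient connects, before open_sftp() is called; a minimal sketch, reusing the hostname and key path from my ssh command above:

import paramiko

ssh_client = paramiko.SSHClient()
# accept this host's key automatically (fine for a throwaway script)
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(
    hostname='ec2-bla-95-blablabla-23.amazonaws.com',
    username='ubuntu',
    key_filename='path/aws.pem',  # the same .pem key passed to ssh -i
)
sftp_client = ssh_client.open_sftp()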
EDIT: As a small correction to Ganesh's answer, you need to do the following to fetch each file individually; otherwise you get an error complaining that you are trying to fetch a directory:
lobj = sftp.listdir_attr(target_folder_remote)
for o in lobj:
    name = o.filename
    sftp.get(os.path.join(target_folder_remote, name),
             os.path.join(target_folder_local, name))
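Putting the pieces together, here is a sketch of the full fetch-and-merge over all the servers, streaming each remote file straight into one local file instead of downloading copies first (the host list, the output name merged.out, and the stat check that skips subdirectories are my own assumptions):

import posixpath
import stat

import paramiko

hosts = [
    'ec2-bla-95-blablabla-23.amazonaws.com',
    'ec2-bla-95-blablabla-43.amazonaws.com',
    'ec2-bla-95-blablabla-24.amazonaws.com',
]
remote_dir = '/roth/files/'

with open('merged.out', 'wb') as merged:  # assumed output filename
    for host in hosts:
        ssh_client = paramiko.SSHClient()
        ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh_client.connect(hostname=host, username='ubuntu',
                           key_filename='path/aws.pem')
        sftp = ssh_client.open_sftp()
        try:
            for attr in sftp.listdir_attr(remote_dir):
                if stat.S_ISDIR(attr.st_mode):
                    continue  # skip subdirectories (the error mentioned above)
                # posixpath.join keeps the remote path POSIX-style even if
                # this script runs on Windows
                remote_file = sftp.open(posixpath.join(remote_dir, attr.filename), 'rb')
                try:
                    merged.write(remote_file.read())
                finally:
                    remote_file.close()
        finally:
            sftp.close()
            ssh_client.close()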