I'm currently trying to write an Airflow job that will let me SSH into an EC2 instance and then start an SFTP session with another host from within that EC2 box. My current code is as follows:

import logging
# On Airflow 1.x the hook comes from airflow.contrib.hooks.ssh_hook;
# on Airflow 2.x it lives in airflow.providers.ssh.hooks.ssh
from airflow.providers.ssh.hooks.ssh import SSHHook

def run_ssh():
    hook = SSHHook(ssh_conn_id='xyz').get_conn()  # returns a paramiko SSHClient
    stdin, stdout, stderr = hook.exec_command('sftp user@host.com;')
    # This next step prompts me for a password, so I provide it
    stdin.write('password')
    logging.info(stdout.readlines())
    stdin, stdout, stderr = hook.exec_command('ls')
    logging.info(stdout.readlines())

When I print the final line I should see some folders, but instead I just see ['a\n']... so it seems I'm not actually able to SFTP. Is there a better way to SFTP from a remote host through a Python script running locally?

Any help with this is appreciated. The answer can be geared towards a simple Python script rather than Airflow.

1 Answer

For your literal question, see:
Pass input/variables to command/script over SSH using Python Paramiko
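
That boils down to writing to the remote command's standard input and flushing it. A minimal sketch of the technique with plain Paramiko (the host, credentials, and command below are placeholders, not taken from the question):

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('ec2-host', username='ec2-user', key_filename='key.pem')

# Run a command that reads stdin and echoes it back, then feed it input and flush
stdin, stdout, stderr = client.exec_command('cat')
stdin.write('some input\n')
stdin.flush()
stdin.channel.shutdown_write()   # signal EOF so the remote command can finish

print(stdout.read().decode())
client.close()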


Though implementing SFTP over a jump host this way is not a good solution.

Use port forwarding instead:
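
A minimal sketch of the port-forwarding approach with Paramiko (host names, user names, and credentials are placeholders; host-key checking is relaxed only to keep the example short):

import paramiko

jump_host = 'ec2-host'      # the EC2 instance you can SSH into directly
target_host = 'host.com'    # the SFTP server reachable from the EC2 instance

# Connect to the jump host (the EC2 instance)
jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # verify host keys properly in real code
jump.connect(jump_host, username='ec2-user', key_filename='key.pem')

# Open a tunnel from the jump host to the target's SSH port
tunnel = jump.get_transport().open_channel(
    'direct-tcpip', dest_addr=(target_host, 22), src_addr=('127.0.0.1', 0))

# Connect to the target through the tunnel and start SFTP over it
target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect(target_host, username='user', password='password', sock=tunnel)

sftp = target.open_sftp()
print(sftp.listdir('.'))    # the 'ls' the question tries to run

sftp.close()
target.close()
jump.close()

The 'direct-tcpip' channel acts as the tunnel: the second connection goes through the EC2 instance, so open_sftp() gives you an SFTP session with the final host without ever driving an interactive sftp prompt.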
