I'm building a serverless service (AWS Lambda) to get/put files on an SFTP server on a daily basis.
The only way to reach the SFTP server is from another machine (an AWS EC2 instance) whose IP is whitelisted at the SFTP server.
I tried the Python sshtunnel (SSHTunnelForwarder) and paramiko libraries, but I'm not sure whether it's possible to bind SSH (tunnel/port forwarding) and SFTP in a single solution.
Script:

import paramiko
from sshtunnel import SSHTunnelForwarder

# open an SSH tunnel through the whitelisted EC2 instance
with SSHTunnelForwarder(
    (ssh_instance_ip, 22),
    ssh_username=ssh_user,
    ssh_private_key='<key_path>',
    remote_bind_address=('0.0.0.0', 22),
    local_bind_address=('127.0.0.1', 10022),
) as tunnel:
    tunnel.start()

    # connect to the SFTP server with paramiko
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(password=sftp_pwd, username=sftp_user, hostname=sftp_host, port='<port_value>',
                   allow_agent=True, disabled_algorithms=dict(pubkeys=["rsa-sha2-512", "rsa-sha2-256"]))

    # raise the transport limits for larger transfers
    tr = client.get_transport()
    tr.default_max_packet_size = 100000000
    tr.default_window_size = 100000000

    sftp = client.open_sftp()
    sftp.listdir('/')

    sftp.close()
    client.close()
Error:
...
DEBUG:paramiko.transport:Sending global request "keepalive@lag.net"
DEBUG:paramiko.transport:[chan 0] EOF sent (0)
DEBUG:paramiko.transport:EOF in transport thread
Traceback (most recent call last):
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 852, in _read_response
    t, data = self._read_packet()
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp.py", line 201, in _read_packet
    x = self._read_all(4)
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp.py", line 188, in _read_all
    raise EOFError()
EOFError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/work/scratch/tunnel_test/em_tunnel/app/tunnel_test.py", line 71, in <module>
    sftp.listdir('/')
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 218, in listdir
    return [f.filename for f in self.listdir_attr(path)]
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 239, in listdir_attr
    t, msg = self._request(CMD_OPENDIR, path)
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 822, in _request
    return self._read_response(num)
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 854, in _read_response
    raise SSHException("Server connection dropped: {}".format(e))
paramiko.ssh_exception.SSHException: Server connection dropped:
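
My current guess is that paramiko should connect to the tunnel's local bind address instead of the SFTP host directly, roughly like the untested sketch below (sftp_port stands in for the real SFTP port; the other variables are the same placeholders as above), but I'm not sure this is the intended way to combine the two libraries:

import paramiko
from sshtunnel import SSHTunnelForwarder

# forward a local port through the EC2 instance to the SFTP server
with SSHTunnelForwarder(
    (ssh_instance_ip, 22),
    ssh_username=ssh_user,
    ssh_pkey='<key_path>',
    remote_bind_address=(sftp_host, sftp_port),
) as tunnel:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    # connect to the forwarded local port instead of the SFTP host itself
    client.connect(
        hostname='127.0.0.1',
        port=tunnel.local_bind_port,
        username=sftp_user,
        password=sftp_pwd,
        allow_agent=False,      # password-only auth
        look_for_keys=False,
    )
    sftp = client.open_sftp()
    print(sftp.listdir('/'))
    sftp.close()
    client.close()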
I have looked at several similar topics and could not confirm a way to bind SSH and SFTP in a single solution that runs in a serverless service such as AWS Lambda.
Could you help me either debug this or suggest other ways (maybe Bash or JavaScript) to bind SSH and SFTP, please?
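
One alternative I have been wondering about, but have not managed to test, is dropping sshtunnel entirely and chaining two paramiko connections over a "direct-tcpip" channel, so that both the SSH hop and the SFTP session live in a single library. A rough sketch of what I mean, reusing the same placeholder variables:

import paramiko

# 1) SSH into the whitelisted EC2 instance (the jump host)
jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jump.connect(ssh_instance_ip, port=22, username=ssh_user, key_filename='<key_path>')

# 2) open a channel from the EC2 instance to the SFTP server
channel = jump.get_transport().open_channel(
    "direct-tcpip",
    dest_addr=(sftp_host, 22),   # where the EC2 instance should connect to
    src_addr=("127.0.0.1", 0),   # nominal source address
)

# 3) run the SFTP session over that channel
target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect(sftp_host, port=22, username=sftp_user, password=sftp_pwd, sock=channel)

sftp = target.open_sftp()
print(sftp.listdir('/'))

sftp.close()
target.close()
jump.close()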
Additional information:
- EC2 access is done via user@host and an RSA private key
- SFTP access is done via user@host and a password