
I'm building a serverless service (AWS Lambda) to get/put files on an SFTP server on a daily basis.

The only way to access the SFTP server is from another machine (an AWS EC2 instance) which has its IP whitelisted at the SFTP server.

I tried to use the Python SSHTunnelForwarder and paramiko libraries, but I'm not sure if it's possible to bind SSH (tunnel forwarding) and SFTP in a single solution.

Script

from sshtunnel import SSHTunnelForwarder
import paramiko

with SSHTunnelForwarder(
    (ssh_instance_ip, 22),
    ssh_username=ssh_user,
    ssh_private_key='<key_path>',
    remote_bind_address=('0.0.0.0', 22),
    local_bind_address=('127.0.0.1', 10022),
) as tunnel:

    tunnel.start()

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(password=sftp_pwd, username=sftp_user, hostname=sftp_host, port='<port_value>', allow_agent=True, disabled_algorithms=dict(pubkeys=["rsa-sha2-512", "rsa-sha2-256"]))

    tr = client.get_transport()
    tr.default_max_packet_size = 100000000
    tr.default_window_size = 100000000

    sftp = client.open_sftp()

    sftp.listdir('/')

    sftp.close()

    client.close()

Error:

...
DEBUG:paramiko.transport:Sending global request "keepalive@lag.net"
DEBUG:paramiko.transport:[chan 0] EOF sent (0)
DEBUG:paramiko.transport:EOF in transport thread
Traceback (most recent call last):
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 852, in _read_response
    t, data = self._read_packet()
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp.py", line 201, in _read_packet
    x = self._read_all(4)
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp.py", line 188, in _read_all
    raise EOFError()
EOFError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/work/scratch/tunnel_test/em_tunnel/app/tunnel_test.py", line 71, in <module>
    sftp.listdir('/')
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 218, in listdir
    return [f.filename for f in self.listdir_attr(path)]
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 239, in listdir_attr
    t, msg = self._request(CMD_OPENDIR, path)
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 822, in _request
    return self._read_response(num)
  File "/home/linuxbrew/.linuxbrew/Cellar/python@3.10/3.10.9/lib/python3.10/site-packages/paramiko/sftp_client.py", line 854, in _read_response
    raise SSHException("Server connection dropped: {}".format(e))
paramiko.ssh_exception.SSHException: Server connection dropped:

I looked through several similar topics and could not find a way to bind SSH and SFTP in a single solution usable in a serverless service such as AWS Lambda.

Could you help me, either by debugging this or by suggesting other ways (maybe bash or JavaScript) to bind SSH and SFTP, please?


additional information:

  • EC2 access is done via user@host and an RSA private key
  • SFTP access is done via user@host and a password
  • What is `<port_value>`? See [Nested SSH using Python Paramiko](https://stackoverflow.com/q/35304525/850848). – Martin Prikryl Jan 25 '23 at 16:48
  • `<port_value>` is the port of the SFTP server, which can vary; not all the SFTP servers I need to reach use the standard port 22. Thanks for sharing the [Nested SSH link](https://stackoverflow.com/q/35304525/850848) @martinprikryl. I'm still having many issues using the nested code you proposed. It might be the configuration of either my jump host or the SFTP server. However, I managed to use the ssh2 lib from JavaScript and it's working pretty fine. – felipe_franceschini Feb 03 '23 at 16:01
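
For reference, here is a minimal sketch of the nested-paramiko approach from the link shared in the comments above: one paramiko session jumps through the EC2 instance, a direct-tcpip channel is opened from there to the SFTP server, and a second paramiko session connects over that channel with the SFTP password. Host names, credentials and sftp_port below are placeholders, and this is untested from inside Lambda:

import paramiko

# Placeholders, mirroring the variables used in the question's script
ssh_instance_ip = 'ec2_jump_host'
ssh_user = 'ssh_user'
sftp_host = 'sftp_host'
sftp_port = 22  # the SFTP server's port, which can vary
sftp_user = 'sftp_user'
sftp_pwd = 'sftp_password'

# First hop: SSH to the EC2 jump host with the RSA private key
jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jump.connect(ssh_instance_ip, port=22, username=ssh_user, key_filename='<key_path>')

# Open a direct-tcpip channel from the jump host to the SFTP server
channel = jump.get_transport().open_channel(
    'direct-tcpip', (sftp_host, sftp_port), ('127.0.0.1', 0)
)

# Second hop: SSH to the SFTP server over that channel, using user/password
target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect(sftp_host, port=sftp_port, username=sftp_user,
               password=sftp_pwd, sock=channel)

sftp = target.open_sftp()
print(sftp.listdir('/'))

sftp.close()
target.close()
jump.close()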

1 Answer


I've found a solution using the ssh2 lib from JavaScript.

The following script accesses the SFTP server with a user and password, jumping over another machine (an EC2 instance in my case) that is accessible with a user and an RSA key. The first client connects to the EC2 jump host, `forwardOut` opens a channel from there to the SFTP server, and a second client connects to the SFTP server over that channel with the SFTP credentials.

The script will list the content of the requested path on the SFTP server and write the result into a my.json file.

Script adapted from the examples in [ssh2 - npm](https://www.npmjs.com/package/ssh2).

const fs = require('fs');
const path = require('path');
const Client = require('ssh2').Client;

// SSH jump host (EC2 instance) details
const sshTunnelIp = "ssh_host";
const sshUser = "ssh_user";
const sshKeyPath = path.resolve(__dirname, "path_to_your_key");
const sshKey = fs.readFileSync(sshKeyPath);

// SFTP server details
const sftpServerIp = "sftp_host";
const sftpServerPort = 22; // adjust if the SFTP server uses a non-standard port
const sftpUser = "sftp_user";
const sftpPass = "sftp_password";

// First client: SSH connection to the jump host (EC2 instance)
const sshClient = new Client();
// Second client: connection to the SFTP server, made through the jump host
const sftpClient = new Client();

sshClient.on('ready', function() {
  // Open a TCP forwarding channel from the jump host to the SFTP server
  sshClient.forwardOut('127.0.0.1', 12345, sftpServerIp, sftpServerPort, function(err, stream) {
    if (err) {
      sshClient.end();
      throw err;
    }

    // Connect to the SFTP server over the forwarded channel
    sftpClient.connect({
      sock: stream,
      username: sftpUser,
      password: sftpPass
    });
  });
});

sftpClient.on('ready', function() {
  // Create an SFTP session on the SFTP server
  sftpClient.sftp(function(err, sftp) {
    if (err) throw err;

    // List the content of a given path from the SFTP server
    sftp.readdir('path_to_sftp_dir', function(err, list) {
      if (err) throw err;

      // Write the result to a json file
      fs.writeFile('./my.json', JSON.stringify(list), function(err) {
        if (err) {
          console.error('Something went wrong');
        }
      });

      console.log("listing directory", list);

      sftpClient.end();
      sshClient.end();
    });
  });
});

// Connect to the jump host (EC2 instance)
sshClient.connect({
  host: sshTunnelIp,
  username: sshUser,
  privateKey: sshKey
});