I have a Python program on a remote server that uploads a file to an AWS S3 bucket when run. If I SSH onto the server and run it with sudo python3 /path/to/backup.py, it works as expected.
I'm now writing a Python program to automate a bigger process, which includes running backup.py. I created a function for this using the paramiko library. This is where the command gets run:
ssh_stdin, ssh_stdout, ssh_stderr = self.ssh.exec_command('sudo python3 /path/to/backup.py', 1800)
logging.debug(f'ssh_stdout: {ssh_stdout.readline()}')
logging.debug(f'ssh_stderr: {ssh_stderr.readline()}')
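In case it's relevant, this is roughly the shape of the wrapper I'd try next (run_remote is just a name I'm using here, not my actual function): it drains both streams completely instead of calling readline() once, then waits for the exit status. Note that exec_command's second positional parameter is bufsize, which is why the timeout is passed by keyword in this sketch.

```python
def run_remote(ssh, command, timeout=1800):
    # Run a command over an existing paramiko SSHClient and block until it
    # finishes. Reading both streams to EOF before asking for the exit
    # status prevents the remote process from stalling on a full pipe.
    stdin, stdout, stderr = ssh.exec_command(command, timeout=timeout)
    out = stdout.read().decode()                 # all of stdout, not one line
    err = stderr.read().decode()                 # all of stderr
    status = stdout.channel.recv_exit_status()   # wait for the remote exit
    return status, out, err
```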
My automation gives me this output:
ssh_stdout: Tue, 19 May 2020 14:36:43 INFO The COS endpoint is 9.11.200.206, writing to vault: SD_BACKUP_4058
The program doesn't do anything after that. When I log onto the server and check the logs of backup.py, I can see that it is still running and appears to be sitting at the file upload. This is the code it's stuck at:
s3_client.upload_file(
    Filename=BACKUP,
    Bucket=BUCKET_NAME,
    Key=SPLIT_FILE_NAME,
    Callback=pp(BACKUP),
    Config=config)
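For completeness, pp here follows the ProgressPercentage pattern from the boto3 docs (this is a sketch; my actual class may differ slightly): pp(BACKUP) builds a callable instance, and upload_file invokes it with the number of bytes transferred in each chunk.

```python
import os
import sys
import threading

class ProgressPercentage:
    # Sketch of the boto3-documented progress callback pattern.
    def __init__(self, filename):
        self._filename = filename
        self._size = float(os.path.getsize(filename))
        self._seen_so_far = 0
        self._lock = threading.Lock()  # upload chunks may arrive from threads

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen_so_far += bytes_amount
            percentage = (self._seen_so_far / self._size) * 100
            sys.stdout.write(
                f"\r{self._filename}  {self._seen_so_far:.0f} / "
                f"{self._size:.0f}  ({percentage:.2f}%)")
            sys.stdout.flush()
```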
I can't understand why it gets stuck at this point when started by my automation program but not when I run it from a terminal on the server. Nothing in the logs helps; it just seems to stall at that point in its execution. Could it be something to do with the callback not being returned?