I am trying to back up a server by using Paramiko over SSH to run a tar
command. With a limited number of files everything works, but on a big folder the script waits endlessly. The following test shows that the problem comes from the size of the stdout.
Is there a way to correct that and execute this kind of command?
Big-output case:
query = 'cd /;ls -lshAR -h'
chan.exec_command(query)
while not chan.recv_exit_status():
    if chan.recv_ready():
        data = chan.recv(1024)
        while data:
            print data
            data = chan.recv(1024)
    if chan.recv_stderr_ready():
        error_buff = chan.recv_stderr(1024)
        while error_buff:
            print error_buff
            error_buff = chan.recv_stderr(1024)
    exit_status = chan.recv_exit_status()
    if 0 == exit_status:
        break
Result (not OK: the script blocks and never returns):
2015-07-25 12:57:07,402 --> Query sent
Small-output case:
query = 'cd /;ls -lshA -h'
chan.exec_command(query)
while not chan.recv_exit_status():
    if chan.recv_ready():
        data = chan.recv(1024)
        while data:
            print data
            data = chan.recv(1024)
    if chan.recv_stderr_ready():
        error_buff = chan.recv_stderr(1024)
        while error_buff:
            print error_buff
            error_buff = chan.recv_stderr(1024)
    exit_status = chan.recv_exit_status()
    if 0 == exit_status:
        break
Result (OK):
2015-07-25 12:55:08,205 --> Query sent
total 172K
4.0K drwxr-x--- 2 root psaadm 4.0K Dec 27 2013 archives
0 -rw-r--r-- 1 root root 0 Jul 9 23:49 .autofsck
0 -rw-r--r-- 1 root root 0 Dec 27 2013 .autorelabel
4.0K dr-xr-xr-x 2 root root 4.0K Dec 23 2014 bin
2015-07-25 12:55:08,307 --> Query executed (0.10)
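For what it's worth, the hang looks like a classic buffered-pipe deadlock: `recv_exit_status()` waits for the remote command to finish, but on big output the remote `ls` blocks as soon as the channel's output buffer fills, so neither side can make progress. The same mechanism can be reproduced locally with `subprocess` (no SSH involved; this only illustrates the drain-before-wait principle):

```python
import subprocess
import sys

# The child prints ~1 MB, far more than the OS pipe buffer (typically 64 KB).
proc = subprocess.Popen(
    [sys.executable, "-c", "print('x' * 1000000)"],
    stdout=subprocess.PIPE,
)

# Drain stdout to EOF first. Calling proc.wait() at this point instead
# would deadlock: the child blocks writing once the pipe buffer fills,
# so it can never exit.
data = proc.stdout.read()
status = proc.wait()  # safe now: the child has flushed everything
print(len(data), status)
```

The fix for the snippets above is analogous: read from the channel until there is nothing left, and only then ask for the exit status.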
Filed on GitHub:
https://github.com/paramiko/paramiko/issues/563
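One way to apply the same idea with Paramiko (a sketch, not a drop-in fix: `drain_and_wait` and `FakeChannel` are made-up names, and the EOF check is simplified) is to keep reading both streams and only call the blocking `recv_exit_status()` once the remote side is done:

```python
import time

def drain_and_wait(chan, bufsize=1024):
    """Read stdout and stderr before asking for the exit status.

    `chan` is assumed to expose the paramiko.Channel methods used below
    (recv_ready, recv, recv_stderr_ready, recv_stderr, exit_status_ready,
    recv_exit_status). Calling recv_exit_status() up front deadlocks on
    big output; draining first keeps the channel window open.
    """
    out, err = [], []
    while True:
        while chan.recv_ready():
            out.append(chan.recv(bufsize))
        while chan.recv_stderr_ready():
            err.append(chan.recv_stderr(bufsize))
        if (chan.exit_status_ready()
                and not chan.recv_ready()
                and not chan.recv_stderr_ready()):
            break
        time.sleep(0.05)  # avoid spinning at 100% CPU while waiting
    return b"".join(out), b"".join(err), chan.recv_exit_status()


# Tiny stand-in so the helper can be exercised without a real server.
class FakeChannel:
    def __init__(self, data):
        self.data = data
    def recv_ready(self):
        return bool(self.data)
    def recv(self, n):
        chunk, self.data = self.data[:n], self.data[n:]
        return chunk
    def recv_stderr_ready(self):
        return False
    def recv_stderr(self, n):
        return b""
    def exit_status_ready(self):
        return True
    def recv_exit_status(self):
        return 0

out, err, status = drain_and_wait(FakeChannel(b"x" * 5000))
print(len(out), status)
```

In real code you may also want a timeout or a check of `chan.eof_received`; the point is simply that the blocking `recv_exit_status()` call must come last.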