
I'm working on testing a Corosync cluster. I'm trying to fail the interface that has the floating IP, to ensure the resource migrates over to another node, using Python.

Now the dilemma is that my command does execute on the remote machine, but my test code hangs forever waiting for a reply it will never get, because the node gets rebooted by the injected failure.

ssh = SSHClient(self.get_ms_ip(ms),
                self.get_ms_user(ms),
                self.get_ms_password(ms))
ssh.connect()
self.logger.info("Failing FIP eth now on %s" % ms)
ssh.exec_command(cmd, timeout=1)
# Code never reaches this comment.

In Python, how can I send the command and just continue on without waiting for any return? I've tried wrapping my ssh.exec_command call with subprocess.Popen, as suggested in Run Process and Don't Wait, but that didn't yield anything different.

user597608
  • Similarly worded question – although about a different problem (not waiting for long/infinite commands) – I believe most users who come here will actually be looking for that: [Do not wait for commands to finish when executing them with Python Paramiko](https://stackoverflow.com/q/66032028/850848). – Martin Prikryl Feb 10 '21 at 07:10

3 Answers


You don't want a subprocess, you want a thread. Spawn a thread that runs the exec_command call and you'll be able to continue with your code.
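
For example, a minimal sketch along those lines, using a plain paramiko.SSHClient rather than the question's wrapper, and with placeholder host, credentials, and command:

import threading
import paramiko

def run_remote(host, user, password, cmd):
    # Open an SSH connection and fire the failure-injection command.
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    try:
        client.exec_command(cmd, timeout=1)
    except Exception:
        # The node may reboot mid-command; ignore the dropped connection.
        pass
    finally:
        client.close()

# Start the command in a daemon thread and keep going immediately.
worker = threading.Thread(target=run_remote,
                          args=("192.0.2.10", "root", "secret", "ifdown eth3"),
                          daemon=True)
worker.start()
# The test continues here without waiting for a reply.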

Colin vH

Did you try nohup?

ssh.exec_command('nohup %s &' % cmd, timeout=1)
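
If the backgrounded command still seems to hold the session open, one variant worth trying (a sketch, reusing the same ssh and cmd objects) is to redirect its output so the remote shell can exit straight away:

detached = "nohup %s >/dev/null 2>&1 &" % cmd
ssh.exec_command(detached, timeout=1)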
HayatoY
  • Tried nohup and the code does continue but now my cmd is failing. Following is my cmd: cmd = "echo %s | sudo -S ifdown eth3" % self.get_ms_root_password(ms) – user597608 Nov 29 '14 at 18:51
  • try this wrapper. easier to exec sudo commands. http://stackoverflow.com/a/22592827/2854735 – HayatoY Nov 29 '14 at 23:36

Python doesn't handle threads nicely here; you can't forcibly exit a thread. I ended up making a worker method that creates the SSH connection and runs exec_command, and running it as a separate multiprocessing.Process.

This way I was able to clean up properly after a test before the next test ran (as part of Python's unittest framework). A rough sketch of that approach is below.
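
Roughly, reusing the SSHClient wrapper and helper methods from the question (names like _fail_fip_worker and fip_proc are just placeholders):

import multiprocessing

def _fail_fip_worker(ip, user, password, cmd):
    # Runs in its own process: builds the SSH connection and fires the
    # failure-injection command, so the test itself never blocks on it.
    ssh = SSHClient(ip, user, password)
    ssh.connect()
    ssh.exec_command(cmd, timeout=1)

# In the test:
self.fip_proc = multiprocessing.Process(
    target=_fail_fip_worker,
    args=(self.get_ms_ip(ms), self.get_ms_user(ms),
          self.get_ms_password(ms), cmd))
self.fip_proc.start()

# In tearDown(), clean up even if the call never returned:
self.fip_proc.terminate()
self.fip_proc.join()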

user597608