
I am new to Python.

I am trying to SSH to a server to perform some operations. However, before performing the operations, I need to load a profile, which takes 60-90 seconds. After loading the profile, is there a way to keep the SSH session open so that I can perform the operations later?

p = subprocess.Popen("ssh abc@xyz './profile'", stdout=subprocess.PIPE, shell=True)
result = p.communicate()[0]
print result
return result

This loads the profile and exits. Is there a way to keep the above ssh session open and run some commands?

Example:

 p = subprocess.Popen("ssh abc@xyz './profile'", stdout=subprocess.PIPE, shell=True)
 <More Python Code>
 <More Python Code>
 <More Python Code>
 <Run some scripts/commands on xyz server non-interactively>

After loading the profile, I want to run some scripts/commands on the remote server, which I am able to do by simply doing below:

 p = subprocess.Popen("ssh abc@xyz './profile; <./a.py; etc>'", stdout=subprocess.PIPE, shell=True)

However, once done, it exits, and the next time I want to execute some script on the above server, I need to load the profile again (which takes 60-90 seconds). I am trying to figure out a way to create some sort of tunnel (or any other mechanism) where the SSH connection remains open after loading the profile, so that users don't have to wait 60-90 seconds whenever anything is to be executed.

I don't have access to strip down the profile.

Koshur
    this is doing exactly what you told him to do - connect via ssh and load profile, that is why it closes after it complete. in order to keep it open you need something like this: `ssh -t abc@xyz "bash -l"`. – ddor254 Dec 19 '17 at 13:22
  • Can you please explain what this does? – Koshur Dec 19 '17 at 13:28
  • from what i remember it leave the ssh open in bash terminal – ddor254 Dec 19 '17 at 13:28
  • After loading the profile, I want to run some scripts/commands on the remote server non-interactively. Not via terminal.. – Koshur Dec 19 '17 at 13:36
  • try https://rpyc.readthedocs.io/en/latest/ instead of ssh – ddor254 Dec 19 '17 at 13:43
  • IMHO you are chasing the wrong rabbit here. Maybe you could find a way to keep an open channel, but it will certainly be harder that what you seem to expect, because you will have to ready to suffer output buffering from the remote. My advice is to strip down the profile to only what is required for your commands. It should be much simpler... Of course, your interactiver profile will be much longer, but it could source the non-interactive one to avoid duplication. Not what you asked for, so only a comment, but you really should think about that way. – Serge Ballesta Dec 19 '17 at 16:33
  • @SergeBallesta Unfortunately I don't have access to strip down the profile. – Koshur Dec 19 '17 at 16:55
  • @Koshur: you mean that you cannot copy the profile in your home directory to a **new** file and in that new file keep only what is relevant for your non interactive commands??? – Serge Ballesta Dec 19 '17 at 17:00
  • https://stackoverflow.com/questions/8475290/how-do-i-write-to-a-python-subprocess-stdin – Nadeem Taj Sep 02 '21 at 18:51

4 Answers


You have to construct an ssh command like this: `['ssh', '-T', 'host_user_name@host_address']`, then follow the code below.

Code:

from subprocess import Popen, PIPE

ssh_conn = ['ssh', '-T', 'host_user_name@host_address']

# if you have to add port then ssh_conn should be as following
# ssh_conn = ['ssh', '-T', 'host_user_name@host_address', '-p', 'port']

commands = """            
    cd Documents/
    ls -l
    cat test.txt
"""

with Popen(ssh_conn, stdin=PIPE, stdout=PIPE, stderr=PIPE, universal_newlines=True) as p:
    output, error = p.communicate(commands)
    print(output)
    print(error)
    print(p.returncode)

# Alternatively, write commands one at a time. Each command needs a trailing
# newline, and this must happen *before* communicate(), which closes stdin
# and may only be called once.
with Popen(ssh_conn, stdin=PIPE, stdout=PIPE, stderr=PIPE, universal_newlines=True) as p:
    p.stdin.write('command_1\n')
    # add as many commands as you want
    p.stdin.write('command_n\n')
    output, error = p.communicate()

Terminal Output:

(screenshot of the terminal output)

Please let me know if you need further explanations.

N.B.: You can add as many commands to the `commands` string as you want.
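To make this answer's pattern easy to try without a remote host, here is a minimal sketch where a local `/bin/sh` stands in for the `['ssh', '-T', ...]` command; everything else follows the same `communicate()` flow described above:

```python
from subprocess import Popen, PIPE

# A local /bin/sh stands in for ['ssh', '-T', 'host_user_name@host_address']
# so the pattern can be run and tested without an SSH server.
shell = ['/bin/sh']

commands = """
    cd /tmp
    pwd
    echo done
"""

with Popen(shell, stdin=PIPE, stdout=PIPE, stderr=PIPE, universal_newlines=True) as p:
    # communicate() feeds the whole command string to stdin, closes it,
    # and collects stdout/stderr until the process exits.
    output, error = p.communicate(commands)

print(output)
print(p.returncode)
```

Swapping `shell` for the real ssh command list is the only change needed to run this against a remote server.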

Sabil
  • I cannot see in your answer, how to use multiple requests to ssh and get the outputs to process? Can I use `output, error = p.communicate(commands)` multiple times here without ssh login/logout again? – Asara Aug 28 '21 at 08:29
  • You can add multiple command in command string and use this `output, error = p.communicate(commands)`. Moreover you can also use `output, error = p.communicate(commands)` multiple times under `with` block – Sabil Aug 28 '21 at 08:54
  • why not you give a try and let me know if you face any issue – Sabil Aug 28 '21 at 08:56
  • I will, however, you should show an example in your answer, so we have a good answer here :) – Asara Aug 28 '21 at 10:54
  • I already did that. I add multiple commands here. – Sabil Aug 28 '21 at 11:41
  • its multiple commands, but this I can also do with `cd Documents; ls -l`. But I cannot run `p.communicate()` multiple times because I will get `ValueError: Cannot send input after starting communication` so I cant run a command, get an output, run another command and get again an output with the same connection – Asara Aug 29 '21 at 16:45
  • I can solve it in ssh configs with `Host * ControlMaster auto ControlPersist yes ControlPath ~/.ssh/socket-%r@%h:%p` but of course Iam looking for a python programmatical solution here – Asara Aug 29 '21 at 16:48
  • Could you please add more details what you want and what you get so far? – Sabil Aug 29 '21 at 17:56
  • If I am guessing correct then you want to communicate multiple times with a process without breaking the pipe, right? – Sabil Aug 29 '21 at 18:18
  • @Asara refer to https://superuser.com/questions/931574/how-do-i-know-when-a-command-run-over-ssh-has-finished , it's impossible to know in advance where the end of the command's output will be. Therefore you can implement technique there (echo "=== COMMAND HAS FINISHED ===") then read the output to that point. – attempt0 Aug 31 '21 at 12:47
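The sentinel technique described in the last comment can be sketched as follows. A local `/bin/sh` stands in for `['ssh', '-T', 'abc@xyz']` so the pattern can be tried without a server, and the marker string is an arbitrary choice; the key idea is that each command is followed by an `echo` of the marker, and the reader consumes lines until it sees that marker:

```python
from subprocess import Popen, PIPE

SENTINEL = "=== COMMAND HAS FINISHED ==="

def run_command(proc, cmd):
    """Send one command and read its output up to the sentinel marker."""
    proc.stdin.write(cmd + "\n")
    proc.stdin.write("echo '%s'\n" % SENTINEL)  # marks the end of this command's output
    proc.stdin.flush()
    lines = []
    while True:
        line = proc.stdout.readline()
        if not line or line.strip() == SENTINEL:
            break
        lines.append(line)
    return "".join(lines)

# A local shell stands in for ['ssh', '-T', 'abc@xyz'] here.
proc = Popen(['/bin/sh'], stdin=PIPE, stdout=PIPE, universal_newlines=True)
first = run_command(proc, "echo hello")   # first round trip
second = run_command(proc, "echo world")  # second round trip, same open session
proc.stdin.write("exit\n")
proc.stdin.flush()
proc.wait()
```

This allows multiple request/response rounds over a single open process, which `communicate()` cannot do, since it may only be called once.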

Try an ssh library like asyncssh or spur. Keeping the connection object alive should keep the session open.

You could also send a dummy command like `date` periodically to prevent the connection from timing out.

Shanavas M

What you want to do is write/read to the process's stdin/stdout.

from subprocess import Popen, PIPE
import shlex

shell_command = "ssh user@address"
proc = Popen(shlex.split(shell_command), stdin=PIPE, universal_newlines=True)

# Do python stuff here

proc.stdin.write("cd Desktop\n")
proc.stdin.write("mkdir Example\n")

# And so on

proc.stdin.write("exit\n")

You must include the trailing newline for each command. If you prefer, print() (as of Python 3.x, where it is a function) takes a keyword argument file, which allows you to forget about that newline (and also gain all the benefits of print()).

print("rm Example", file=proc.stdin)

Additionally, if you need to see the output of your command, you can pass stdout=PIPE and then read via proc.stdout.read() (same for stderr).

You may also want to put the `exit` command in a `try`/`finally` block, to ensure you exit the ssh session gracefully.

Note that a) `read` is blocking, so if there's no output it will block forever, and b) it only returns what was available to read from stdout at that time, so you may need to read repeatedly, sleep for a short time, or poll for additional data. See the `fcntl` and `select` stdlib modules for switching a blocking read to non-blocking and for polling for events, respectively.
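A minimal sketch of the `select`-based polling mentioned above, using a local `/bin/sh` in place of the ssh process so it can be run without a server:

```python
import select
from subprocess import Popen, PIPE

# A local shell stands in for the ssh process so the sketch is testable.
proc = Popen(['/bin/sh'], stdin=PIPE, stdout=PIPE, universal_newlines=True)
proc.stdin.write("echo ready\n")
proc.stdin.flush()

# Wait up to 5 seconds for the pipe to become readable, then read one line.
# select avoids blocking forever when no output has arrived yet.
readable, _, _ = select.select([proc.stdout], [], [], 5)
line = proc.stdout.readline() if readable else ""
print(line)

proc.stdin.write("exit\n")
proc.stdin.flush()
proc.wait()
```

`select.select` works on pipe file objects on Unix; on Windows it only accepts sockets, so a reader thread is the usual alternative there.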

Dillon Davis

Hello Koshur!

I think that what you are trying to achieve looks like what I've tried in the past when trying to make my terminal accessible from a private website:

I would open a bash instance, keep it open and would listen for commands through a WebSocket connection.

What I did to achieve this was using the O_NONBLOCK flag on STDOUT.

Example

import fcntl
import os
import shlex
import subprocess

current_process = subprocess.Popen(shlex.split("/bin/sh"), stdin=subprocess.PIPE,
                                   stdout=subprocess.PIPE, stderr=subprocess.STDOUT)  # Open a shell prompt
fcntl.fcntl(current_process.stdout.fileno(), fcntl.F_SETFL,
            os.O_NONBLOCK)  # Non blocking stdout and stderr reading

What I would have after this is a loop checking for new output in another thread:

from time import sleep
from threading import Thread

def check_output(process):
    """
    Checks the output of stdout and stderr to send it to the WebSocket client
    """
    while process.poll() is None: # while the process isn't exited
        try:
            output = process.stdout.read() # Read the stdout PIPE (which contains stdout and stderr)
        except Exception:
            output = None
        if output:
            print(output)
        sleep(.1)
    # from here, we are outside the loop: the process exited
    print("Process exited with return code: {code}".format(code=process.returncode))

Thread(target=check_output, args=(current_process,), daemon=True).start() # Start checking for new text in stdout and stderr

So you would need to implement your logic to SSH when starting the process:

current_process = subprocess.Popen(shlex.split("ssh abc@xyz './profile'"), stdin=subprocess.PIPE,
                                   stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

And send commands like so:

def send_command(process, cmd):
    process.stdin.write(str(cmd + "\n").encode("utf-8")) # Write the input to STDIN
    process.stdin.flush() # Run the command

send_command(current_process, "echo Hello")

EDIT

I tried to check the minimum Python requirements for the given examples and found that `Thread(daemon=...)` might not work on Python 2.7, which the question is tagged with.

If you are sure to exit the Thread before exiting, you can ignore daemon and use Thread() which works on 2.7. (You could for example use atexit and terminate the process)
