
I have a script that can run on my host machine and on several other servers. I want to launch this script as a background process on the host machine and, via ssh, on the remote machines. The stdout/stderr should go to a file on the host machine for the host background process, and to a file on each remote machine for the remote background processes.

I tried with

subprocess.check_output(['python', 'script.py', 'arg_1', '> file.log', '& echo -ne $!'])

but it doesn't work: it neither gives me the PID nor writes into the file. It does work with shell=True, but I have read that shell=True should be avoided for security reasons.

Then I tried

p = subprocess.Popen(['python', 'script.py', 'arg_1', '> file.log'])

Now I can get the process PID, but the output is not written to the remote log file.

Using the stdout/stderr arguments, as suggested in "append subprocess.Popen output to file?", opens the log file on my host machine, not on the remote machine; I want to log on the remote machine instead.

Could someone please suggest a single invocation that works on my host machine and also ssh's to the remote server, launches the background process there, and writes to the output file?

<HOW_TO_GET_PID> = subprocess.<WHAT>( ([] if 'localhost' else ['ssh','<remote_server>']) + ['python', 'script.py', 'arg_1' <WHAT>] )

Could someone please finish the above pseudocode?

Thanks,

Charles Duffy
user4772933
  • Have you tried capturing the output using `stdout=PIPE, stderr=PIPE` then `.communicate()`? – S3DEV Sep 03 '20 at 13:30
  • https://stackoverflow.com/a/7224186/4772933 `communicate` is blocking and, as I said, I want to launch a background task, so I can't use it – user4772933 Sep 03 '20 at 13:37
  • `>`, `&`, etc. are all shell directives. They're meaningless unless passed to a shell. – Charles Duffy Sep 03 '20 at 14:09
  • However, you _don't need_ those directives: You can tell `subprocess` to do the same thing directly. For example, `stdout=open('somefile', 'w')` instead of putting `>somefile` in the command. – Charles Duffy Sep 03 '20 at 14:10
  • BTW, note that requests for "one line" answers typically compromise readability, correctness, or both. Stack Overflow's scope limits it to _practical_ questions; code that isn't readable or correct is not practical to put to real-world mission-critical use. – Charles Duffy Sep 03 '20 at 14:12
  • @CharlesDuffy thanks, that will work for the host machine, but what about the remote machine, if I want to log there? By "log" I mean the output of script.py – user4772933 Sep 03 '20 at 14:13
  • ...beyond that, you've got several different questions together here that should be asked individually. "How can I conditionally send my subprocess over ssh?" is one question (and a good one, there are some tricky bits that are hard to get right!); "How do I send subprocess output to a file?" is another, etc. – Charles Duffy Sep 03 '20 at 14:13
  • In the remote-machine case, you have no choice but to enlist the help of a remote shell; `ssh` _always_ runs such a shell, so you're in a good place wrt. ability to start it. – Charles Duffy Sep 03 '20 at 14:14
  • ...in terms of taking an argument vector and turning it into a command ssh can handle, though -- the Right Thing is to use `shlex.quote()` (or in Python 2, `pipes.quote()`) to escape content that _isn't_ expected to be treated as shell syntax before adding it to the command ssh is going to be asked to run. – Charles Duffy Sep 03 '20 at 14:14
  • Here is a related [question/answer](https://stackoverflow.com/a/7389473/6340496), very close to the answer given by @Dalen. Might be of use. – S3DEV Sep 03 '20 at 14:15
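The two pieces of advice in the comments above — letting `subprocess` open the file via `stdout=` instead of shell `>`, and quoting with `shlex.quote()` before handing a command string to ssh — can be sketched as follows. The `-c` one-liner stands in for `script.py` so the sketch is runnable, and `remote_server` is a placeholder hostname:

```python
import shlex
import subprocess
import sys

# Stand-in for ['python', 'script.py', 'arg_1'] so the sketch is runnable.
argv = [sys.executable, "-c", "print('hello from script')"]

# Local case: have subprocess open the log file itself; no shell, no '>' needed.
with open("file.log", "wb") as log:
    p = subprocess.Popen(argv, stdout=log, stderr=subprocess.STDOUT)
pid = p.pid  # the PID is available immediately, without 'echo $!'
p.wait()     # only for demonstration; a real background task keeps running

# Remote case: ssh hands this string to a remote shell, which performs the
# redirection over there, so quote everything that is *not* shell syntax.
remote_cmd = " ".join(shlex.quote(w) for w in argv) + " > file.log"
# subprocess.Popen(["ssh", "remote_server", remote_cmd])
```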

2 Answers


# At the beginning you could even daemonize automatically using os.fork();
# otherwise, run this script with something like:
#     nohup python run_my_script.py &
# This ensures it keeps running even if the SSH connection breaks.
from subprocess import Popen, PIPE, STDOUT

p = Popen(["python", "yourscript.py"], stdout=PIPE, stderr=STDOUT, stdin=PIPE)
p.stdin.close()
log = open("logfile.log", "wb")
log.write(b"PID: %i\n\n" % p.pid)
while 1:
    line = p.stdout.readline()
    if not line: break
    log.write(line)
    log.flush()

p.stdout.close()
# p.poll() can still be None right after stdout hits EOF; p.wait() blocks
# until the exit status is actually available.
log.write(b"\nExit status: %i" % p.wait())
log.close()
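As the comment at the top of the answer hints, if the log only needs to land in a file and no live monitoring loop is required, the child can write to the file directly and be detached from the terminal. A minimal POSIX-only sketch (with a `-c` one-liner standing in for `yourscript.py`):

```python
import subprocess
import sys

# Stand-in for ["python", "yourscript.py"] so the sketch is runnable.
argv = [sys.executable, "-c", "print('work done')"]

log = open("daemon.log", "wb")
# start_new_session=True calls setsid() in the child (POSIX only), so the
# process keeps running even if the SSH session that launched it goes away,
# much like 'nohup ... &'.
p = subprocess.Popen(argv, stdout=log, stderr=subprocess.STDOUT,
                     stdin=subprocess.DEVNULL, start_new_session=True)
print("PID:", p.pid)
```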

Dalen
  • Nice. Quick correction: `stder` should be `stderr`. – S3DEV Sep 03 '20 at 13:48
  • Yep, I just noticed that. :D – Dalen Sep 03 '20 at 13:49
  • @Dalen thanks, but I want to log on the remote machine, not on the host machine. And the script launching this background task can't loop forever; it is doing other tasks as well. It is not what I asked for in the question. Thanks though – user4772933 Sep 03 '20 at 13:50
  • @Dalen I have already seen this code and it is not what I want. If I want to use multiple threads and complicate a line of code, then I will come back to you for sure. And thanks for sharing your wisdom. Instead of writing a paragraph of wisdom, if you could write me a solution in one line, that would be really appreciated. But nonetheless, many thanks for your help. – user4772933 Sep 03 '20 at 14:08
  • 2
    Why `stdout=PIPE`, instead of `stdout=open("logfile.log", "wb")`? Then everything will go to your file _directly_ and you don't need to be in the middle. – Charles Duffy Sep 03 '20 at 14:10
  • @Dalen not sure, but just checking: by log I mean logging the output of script.py. I am not getting why you would want me to run the while loop on the host, then ssh to the remote server, execute a background process there, and pipe the output from remote to the host? That doesn't make any sense. – user4772933 Sep 03 '20 at 14:14
  • @CharlesDuffy : A leftover from the days when using it wasn't reliable. Anyway, you have more control this way, and you do file.flush() for each output line so that you can read what is happening if you open the file while the script is running. You can certainly do as you suggest, and much of the OP's dilemma is solved with it, but I wouldn't do it that way. You have more possibilities for detecting other problems with the child process this way. – Dalen Sep 03 '20 at 14:27
  • @CharlesDuffy As I read the Q, the same script should be launched on multiple computers, monitored there, and the log should be saved there. A script per PC and its logfile. So you put my script on each computer and start it on each, where it launches script.py and monitors it. If there is a need to start script.py on each PC from the host and get the log to the host, well, as I said, then using paramiko will do the trick, although there are, as I wrote, many tools for synchronized startup and monitoring of processes over SSH. – Dalen Sep 03 '20 at 14:46
  • @CharlesDuffy sorry, the last comment was intended for user4772933. It seems we don't understand each other – Dalen Sep 03 '20 at 14:59
  • @user4772933 Oh, BTW, multithreading is nothing complicated or complex. If you do not need the log to be written in real time, you can use p.communicate() within a thread, and then it doesn't matter that it is a blocking function. You just write a function that will be executed in a thread, i.e. non-blockingly, and start it as a thread using the threading or thread/_thread module. Just ensure that you have a way of interrupting it; in my code's case, not while 1 but some global variable instead, which you would set to 0/False in case of e.g. KeyboardInterrupt in the main thread, or whatever. – Dalen Sep 03 '20 at 15:13

You're not going to get something that's safe and correct in a one-liner without making it unreadable; better not to try.

Note that we're using a shell here: in the local case we explicitly pass shell=True, whereas in the remote case ssh always, implicitly, starts a shell.

import shlex
import subprocess

def startBackgroundCommand(argv, outputFile, remoteHost=None, andGetPID=False):
    # Quote each word so the (local or remote) shell treats it literally.
    cmd_str = ' '.join(shlex.quote(word) for word in argv)
    if outputFile is not None:
        cmd_str += ' >%s' % (shlex.quote(outputFile),)
    if andGetPID:
        cmd_str += ' & echo "$!"'
    if remoteHost is not None:
        p = subprocess.Popen(['ssh', remoteHost, cmd_str], stdout=subprocess.PIPE)
    else:
        p = subprocess.Popen(cmd_str, stdout=subprocess.PIPE, shell=True)
    return p.communicate()[0]

# Run your command locally
startBackgroundCommand(['python', 'script.py', 'arg_1'],
    outputFile='file.log', andGetPID=True)

# Or run your command remotely
startBackgroundCommand(['python', 'script.py', 'arg_1'],
    remoteHost='foo.example.com', outputFile='file.log', andGetPID=True)
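The function returns whatever the backgrounded pipeline printed, so with `andGetPID=True` the PID comes back as bytes. A self-contained sketch of recovering it as an int — the helper is restated here in minimal form, with a `-c` one-liner standing in for `script.py`:

```python
import shlex
import subprocess
import sys

def start_background(argv, output_file=None, and_get_pid=False):
    # Same technique as the answer above, condensed (local case only).
    cmd = " ".join(shlex.quote(word) for word in argv)
    if output_file is not None:
        cmd += " >" + shlex.quote(output_file)
    if and_get_pid:
        cmd += ' & echo "$!"'
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, shell=True)
    return p.communicate()[0]

out = start_background([sys.executable, "-c", "print('hi')"],
                       output_file="bg.log", and_get_pid=True)
pid = int(out.strip())  # the shell's $! for the backgrounded command
```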
Charles Duffy
  • thanks, I understood from your comments that I would have to use 2 separate cases, 1 for ssh and 1 for local. I could make it work. Thanks for your help – user4772933 Sep 03 '20 at 15:17