
I have the following simplified code in Python:

import subprocess
import time

proc_args = "gzip --force file; echo this_still_prints > out"
post_proc = subprocess.Popen(proc_args, shell=True)

while True:
    time.sleep(1)

Assume file is big enough to take several seconds to process. If I close the Python process while gzip is still running, gzip ends, but the echo command that follows still executes. I'd like to know why this happens, and whether there's a way to make it not continue executing the following commands.

Thank you!

3 Answers


A process exiting does not automatically cause all its child processes to be killed. See this question and its related questions for much discussion of this.

gzip exits because the pipe containing its standard input is closed when the parent exits; it reads EOF and exits. However, the shell that's running the two commands is not reading from stdin, so it doesn't notice this. It just continues on and executes the echo command (which doesn't read stdin either).
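If the goal is to stop the echo as well, one option (on POSIX, which is an assumption since the question doesn't name a platform) is to start the shell in its own process group with start_new_session=True and signal the whole group, which takes down both the shell and whatever command it is currently running. A sketch, using sleep as a stand-in for the long-running gzip:

```python
import os
import signal
import subprocess
import tempfile
import time

with tempfile.TemporaryDirectory() as d:
    out = os.path.join(d, "out")
    # sleep stands in for the slow gzip; the echo should never run
    post_proc = subprocess.Popen(
        "sleep 5; echo this_still_prints > %s" % out,
        shell=True,
        start_new_session=True,  # the shell becomes leader of a new process group
    )
    time.sleep(0.5)
    # signal the whole group: the shell AND its current child both die
    os.killpg(os.getpgid(post_proc.pid), signal.SIGTERM)
    post_proc.wait()
    print(os.path.exists(out))  # False: the echo never ran
```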

Barmar
  • You're right, Barmar. Thank you very much for the clarification. Now I understand what's happening; for now I'll have to split the commands so that doesn't happen anymore and add some code to manage that. This info is very useful, thanks! – Davi Duarte Pinheiro Oct 02 '12 at 00:24

I believe post_proc.kill() is what you are looking for ... but as far as I know you must call it explicitly.

see: http://docs.python.org/library/subprocess.html#subprocess.Popen.kill
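As a sketch of wiring kill() into a signal handler (SIGUSR1 sent to ourselves is used here purely for demonstration; a real program would register the same handler for SIGTERM/SIGINT and then exit):

```python
import os
import signal
import subprocess

post_proc = subprocess.Popen(["sleep", "30"])  # stand-in for the long-running job

def shutdown(signum, frame):
    # explicitly kill the child before we go away
    post_proc.kill()

signal.signal(signal.SIGUSR1, shutdown)

os.kill(os.getpid(), signal.SIGUSR1)  # simulate receiving the signal
post_proc.wait()
print(post_proc.returncode)  # negative on POSIX: the child died from a signal
```

Note that with shell=True, kill() signals only the shell process, not gzip itself, so the already-running gzip can survive it.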

Joran Beasley
  • My program has a signal handler, so this will surely help prevent this problem when I receive a signal to close. The problem is I can't do that if the program crashes. – Davi Duarte Pinheiro Oct 01 '12 at 23:43
  • it shouldn't crash is the obvious answer :P ... in reality you have to handle this explicitly ... I don't think there is a way around it... maybe put your whole program in a try/except/finally and close all processes in the finally... that way it's pretty much guaranteed to close... – Joran Beasley Oct 01 '12 at 23:45
  • You're right, that would be the optimal solution, but the machine can crash, there might be a power outage and other catastrophic events. I am concerned because I will be handling critical data, so I need to be ready for these kinds of events. But your solution will surely reduce the risks a lot, thank you very much! – Davi Duarte Pinheiro Oct 01 '12 at 23:51
  • Well, in fact a machine crash wouldn't be a problem, as it wouldn't run the next command anyway haha, my bad. – Davi Duarte Pinheiro Oct 01 '12 at 23:52
  • I think if you wrap the whole thing in try/except/finally and kill them in the finally it should work in all instances that are reasonable ... – Joran Beasley Oct 01 '12 at 23:53
  • Just created a signal handler that does post_proc.kill() and it's now working like a charm! Thank you a lot Joran, you saved my day! =] – Davi Duarte Pinheiro Oct 01 '12 at 23:58
  • By the way, I thought it was working, but it was just a bad test. I tested again and it's not working; it still executes the following command. I think what Barmar said is what's happening. – Davi Duarte Pinheiro Oct 02 '12 at 00:20

I use try/finally in such cases (unfortunately you cannot use with the way you would with open()):

import subprocess
import time

proc_args = "gzip --force file; echo this_still_prints > out"
post_proc = subprocess.Popen(proc_args, shell=True)

try:
    while True:
        time.sleep(1)
finally:
    post_proc.kill()
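If you do want something closer to a with block, a small context-manager wrapper can provide it (managed_popen is a hypothetical helper name, not part of the subprocess module):

```python
import subprocess
from contextlib import contextmanager

@contextmanager
def managed_popen(*args, **kwargs):
    proc = subprocess.Popen(*args, **kwargs)
    try:
        yield proc
    finally:
        proc.kill()  # make sure the child never outlives the block
        proc.wait()  # reap it so we don't leave a zombie behind

with managed_popen(["sleep", "30"]) as proc:
    pass  # do the real work here; the child is killed even on an exception

print(proc.returncode)  # negative on POSIX: killed by SIGKILL
```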
javaxian