I wrote a function `fun0` that calls:

- a subprocess `p1`,
- a function `fun1`,
- and a function `fun2` that calls another process `p2`.

The two processes `p1` and `p2` are external files (`script1.py` and `script2.py`). The code of the function `fun0` is:
    import subprocess

    def fun0():
        # call the first process
        p1 = subprocess.Popen(["python", "script1.py"])
        try:
            p1.wait()
        except KeyboardInterrupt:
            try:
                p1.terminate()
            except OSError:
                pass
            p1.wait()

        # call the first function
        fun1()

        # call the second function three times
        for i in range(3):
            fun2()

    def fun2():
        # call the second process
        p2 = subprocess.Popen(["python", "script2.py"])
        try:
            p2.wait()
        except KeyboardInterrupt:
            try:
                p2.terminate()
            except OSError:
                pass
            p2.wait()
`script2.py` uses threading to run two functions at the same time. Its (simplified) code is as follows:
    import threading
    from datetime import datetime

    def Ref():
        # read the reference value (pseudocode)
        return ref_value

    def Read():
        while read_interval <= some_time:
            # read a value (pseudocode)
            yield value

    def Add():
        reader = Read()          # create the generator once and reuse it
        while delta > error:
            while delta > limit:
                while True:
                    value = next(reader)
                    # change delta and check conditions (pseudocode)
            while True:
                value = next(reader)
                # change delta and check conditions (pseudocode)
        return result

    if __name__ == '__main__':
        t0 = threading.Thread(target=Ref)
        t0.start()
        t0.join()

        readTime = datetime.now()
        t1 = threading.Thread(target=Read)
        t2 = threading.Thread(target=Add)
        t1.start()
        t2.start()
        t1.join()
        t2.join()
I would like to stop the execution of the function `fun0()` externally, i.e. from another function. When the stop occurs, I would also like the functions `fun1` and `fun2` and the processes `p1` and `p2` to stop, and possibly to retrieve the data from them. I wonder what would be an elegant, clean and Pythonic way to do this. I am considering the options below (I have sketched the two I understand best after the list):
- threading,
- multiprocessing,
- using another function,
- using signals.
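As I understand the multiprocessing option, `fun0` would run in its own `multiprocessing.Process`, which another function can then terminate, with a `Queue` to get data back. This is only a rough sketch of what I picture; the `runner` wrapper, the queue and the `my_module` import are my own assumptions, and `fun0` would have to be changed to actually return its data:

    import multiprocessing
    import time

    from my_module import fun0   # hypothetical: wherever fun0 actually lives

    def runner(result_queue):
        # fun0 currently returns nothing, so it would have to be changed to
        # return (or put on the queue) whatever data should be retrieved
        result_queue.put(fun0())

    if __name__ == "__main__":
        result_queue = multiprocessing.Queue()
        proc = multiprocessing.Process(target=runner, args=(result_queue,))
        proc.start()

        time.sleep(5)             # stand-in for the external stop condition
        if proc.is_alive():
            proc.terminate()      # note: this does not automatically stop the
            proc.join()           # child subprocesses p1/p2 that fun0 started

        if not result_queue.empty():
            data = result_queue.get()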
I have read in this post (28906558) that stopping the function using multiprocessing should be the way to do it, but I would like to hear more opinions. Thank you.
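For comparison, the threading option I can picture is a cooperative stop flag: `fun0` would run in a worker thread and poll a `threading.Event` (and the subprocesses via `poll()`) instead of blocking in `wait()`. The names `stop_event`, `wait_or_stop` and `fun0_cooperative` below are mine, not part of my current code:

    import subprocess
    import threading
    import time

    stop_event = threading.Event()   # set from another function to request a stop

    def wait_or_stop(proc):
        # Poll the subprocess instead of blocking in wait(), so the stop
        # flag can interrupt it; terminate the process when a stop is requested.
        while proc.poll() is None:
            if stop_event.is_set():
                proc.terminate()
                proc.wait()
                return False
            time.sleep(0.1)
        return True

    def fun0_cooperative():
        # same structure as fun0, but every blocking wait goes through wait_or_stop
        p1 = subprocess.Popen(["python", "script1.py"])
        if not wait_or_stop(p1):
            return
        # fun1() and fun2() would need similar stop_event checks

    worker = threading.Thread(target=fun0_cooperative)
    worker.start()

    # ... later, from another function:
    stop_event.set()
    worker.join()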