
Is there a simple way to make a process terminate if its alive status remains True beyond a given timeout?

I am using Python's multiprocessing module to launch processes on several cores. Sometimes those processes do not succeed and get stuck without doing anything, but they still report an alive status (is_alive() returns True).

import multiprocessing

jobs = []                               # this list will contain all jobs

for i in studies:                       # launch as many processes as there are elements in studies
    arguments = (i,)                    # my arguments; note the trailing comma, args must be a tuple
    p = multiprocessing.Process(target=myprocess, args=arguments)
    jobs.append(p)                      # keep track of the job
    p.start()                           # start the process

I am looking for something that replaces p.start() with something like:

    p.start_with_timeout(t=mytime)

Thanks!

Gerbender

1 Answer


I have managed to solve my problem in the following way:

Originally I run studies, and each study is a separate process. My problem is that I want those processes to 'die' by themselves after a certain timeout. I cannot do p.join(timeout) followed by p.terminate() for each process inside the main loop, since that would block the loop and delay the start of the next process until the previous one has finished (or been terminated).
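
For illustration, the blocking approach I wanted to avoid would look roughly like this (just a sketch, with the timeout value made up):

timeout = 600                           # seconds

for i in studies:
    p = multiprocessing.Process(target=myprocess, args=(i,))
    p.start()
    p.join(timeout)                     # blocks here; the next study cannot start yet
    if p.is_alive():                    # still running after the timeout?
        p.terminate()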

The solution is to introduce a hierarchy where p.start() does not directly start the process I am interested in ('myprocess'), but an intermediate one that takes care of the killing. I have called it 'dordie':

jobs = []                               # this list will contain all jobs

for i in studies:                       # launch as many processes as there are elements in studies
    arguments = (i,)                    # my arguments; note the trailing comma, args must be a tuple
    p = multiprocessing.Process(target=dordie, args=arguments)
    jobs.append(p)                      # list of jobs
    p.start()                           # start the watchdog process

'dordie' takes care of launching each 'myprocess' and killing it, outside the 'for' loop above. This ensures that no matter how many processes are launched, each of them either finishes on its own or is terminated once the timeout hardcoded inside 'dordie' expires, while nothing prevents further processes from being started in the meantime.

def dordie(i):
    arguments = (i,)                    # note the trailing comma, args must be a tuple
    timeout   = 600                     # hardcoded timeout in seconds

    p = multiprocessing.Process(target=myprocess, args=arguments)
    p.start()
    p.join(timeout)                     # wait at most `timeout` seconds
    if p.is_alive():                    # still running after the timeout?
        p.terminate()                   # then kill it
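
For completeness, here is how the two pieces fit together as a self-contained sketch (myprocess is just a dummy stand-in here, and the values of studies and the timeout are made up for illustration):

import multiprocessing
import time

def myprocess(i):                       # dummy stand-in for the real study
    time.sleep(i)

def dordie(i):
    timeout = 600                       # hardcoded timeout in seconds
    p = multiprocessing.Process(target=myprocess, args=(i,))
    p.start()
    p.join(timeout)                     # wait at most `timeout` seconds
    if p.is_alive():                    # still running after the timeout?
        p.terminate()                   # kill it
        p.join()                        # and reap the terminated child

if __name__ == '__main__':
    studies = [1, 2, 3]                 # made-up example input
    jobs = []
    for i in studies:
        p = multiprocessing.Process(target=dordie, args=(i,))
        jobs.append(p)
        p.start()                       # the watchdogs run in parallel, nothing blocks here
    for p in jobs:
        p.join()                        # wait for all watchdog wrappers at the end

Dropping the timeout to a small value (e.g. 2 seconds) makes it easy to see the terminate() branch being taken.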

I am not a programmer, so maybe this is not an elegant solution, but it is working wonders! Thanks anyway for your suggestions.

Gerbender