I am using Python multiprocessing to split one of the longer processes and run it in parallel. It works fine, except when there is an exception in one of the child processes: in that case the process pool is not closed and I can still see those processes on the server.

Here is the code:

from functools import partial
from multiprocessing import Pool

def test_function(param_data, index):
   try:
      pass  # The process here;
   except Exception as e:
      pass  # Close the process pool;

pool = Pool(processes=4)
param_data = "Test Value"
func = partial(test_function, param_data)
r = pool.map(func, range(3))
pool.close()

Calling `pool.close()` inside the except block fails with

NameError: global name 'pool' is not defined

I tried to kill the process on exception with the following code:

    except Exception as e:
       import os
       import signal
       pid = os.getpid()
       os.kill(pid, signal.SIGTERM) 

But I can still see the process on the server. This is still not the best solution anyway: it only terminates the child process that encountered the exception, while the other processes keep running.

I want all processes to terminate on completion, irrespective of whether they encounter an exception or not.

I am running Python 2.7.

PS: I cannot install a new library such as psutil on the server; I am looking for a solution that uses only the standard library.

I checked a few similar questions in this forum, such as auto kill process and child, but they did not address this exact issue.

    Did you try to get the results with `r.get()` before calling `pool.close()`? I think the processes won't exit before you get their results after calling `pool.map()`. Oh, and you shouldn't try to touch the `pool` object from child processes, as documentation says: `Note that the methods of a pool should only ever be used by the process which created it.` – Maciek Jun 16 '17 at 11:54
  • Thank you @Maciek , I got the solution by putting the whole parent process code within try ... except block. Shared it as an answer below. – Codeformer Jun 16 '17 at 13:02
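As Maciek's comment notes, an exception raised inside a worker is re-raised in the parent by `pool.map()` itself, so it can be caught there without ever touching the pool from a child. A minimal sketch of that behaviour (the `worker` function, its failure condition, and `run_job` are made up for illustration):

```python
from multiprocessing import Pool

def worker(index):
    # Hypothetical task: fail on one input to show how errors propagate.
    if index == 1:
        raise ValueError("failed on index {}".format(index))
    return index * 2

def run_job():
    pool = Pool(processes=2)
    try:
        results = pool.map(worker, range(3))  # the worker's error is re-raised here
        pool.close()  # normal shutdown: accept no more tasks
        return results
    except ValueError as e:
        pool.terminate()  # stop the remaining workers immediately
        return "caught: {}".format(e)
    finally:
        pool.join()  # always reap the worker processes

if __name__ == "__main__":
    print(run_job())
```

Note that `pool.join()` must come after `close()` or `terminate()`; calling it on a running pool raises a `ValueError`.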

2 Answers

I found the solution for this: catch the exception in the parent process.

from functools import partial
from multiprocessing import Pool

try:
   pool = Pool(processes=4)
   param_data = "Test Value"
   func = partial(test_function, param_data)
   r = pool.map(func, range(3))
   pool.close()
except Exception as e:
   pool.terminate()  # close() alone does not stop workers that are still running
finally:
   pool.join()

And also add a try/except in the child process function:

def test_function(param_data, index):
    try:
        pass  # The process here;
    except Exception as e:
        raise Exception(str(e))  # re-raise as a plain, picklable Exception

The except block here looks redundant, but it is not. Some exceptions raised by a child process are never delivered to the parent, typically because the exception instance cannot be pickled and sent back across the process boundary.

For example, a custom exception I raised in the function never reached the parent process. So it is advisable to catch all exceptions within the child process and re-raise a standard Exception from there, then handle it in the parent process, where you can close the process pool or do other cleanup.

  • I think that `pool.close()` is not enough to free also resources. After you should call `pool.terminate()` – decadenza Jun 14 '18 at 09:46

You have to call `del pool` after `pool.terminate()` and `pool.join()`.

This will solve your problem
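A sketch of that cleanup order, with hypothetical `square`/`compute` names:

```python
from multiprocessing import Pool

def square(x):
    return x * x

def compute(values):
    pool = Pool(processes=2)
    try:
        return pool.map(square, values)
    finally:
        pool.terminate()  # stop all worker processes
        pool.join()       # wait until they have exited
        del pool          # drop the reference so the Pool object can be collected
```

The `terminate()`/`join()` pair is what actually reaps the worker processes; the `del` only releases the `Pool` object in the parent.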
