
I want to write a program in Python 3.6 using multiprocessing, where workers execute a function

def f(x):
    ...

in such a way that every time an error is raised in the child process, that child process should restart.

This is my code:

for worker in workers:
    def nested(worker):
        try:
            proc = multiprocessing.Process(target=f, args=(args[worker],))
            proc.start()
        except:
            nested(worker)
    nested(worker)

The problem is that this structure does not catch errors raised in the child process, so it does not work as intended. Unfortunately, the solutions in Python Multiprocessing: Handling Child Errors in Parent are very specific to the problem in that thread and cannot really be applied here.

Does anyone have an idea how to fix this?

Paul Rousseau
    Why don't you catch the error in the worker itself, i.e. create a proxy function that will call `f()`, then capture the errors in that proxy function and call `f()` again if needed, and use that proxy function as the `Process` `target`. – zwer Oct 19 '18 at 08:07
  • Good answer. Thanks. – Paul Rousseau Oct 19 '18 at 08:16
  • Get error :AttributeError: Can't pickle local object 'main..nested' Traceback (most recent call last): File "", line 1, in File "C:\Users\Paul\AppData\Local\Programs\Python\Python36\lib\multiprocessing\spawn.py", line 105, in spawn_main exitcode = _main(fd) File "C:\Users\Paul\AppData\Local\Programs\Python\Python36\lib\multiprocessing\spawn.py", line 115, in _main self = reduction.pickle.load(from_parent) EOFError: Ran out of input – Paul Rousseau Oct 19 '18 at 08:36
  • New code looks like this : def nested(args[worker]): try: f(args[worker]) except: nested(args[worker]) for worker in workers: proc=multiprocessing.Process(target=nested,\ args=(args[worker])) proc.start() – Paul Rousseau Oct 19 '18 at 08:40
  • Can you edit your question and post your changes/issues there, it's very difficult to follow it in the comments? – zwer Oct 19 '18 at 09:40
  • Yeah, sorry, I was not sure whether it would be appropriate since the question is now more about function levels (I solved it by putting the proxy function in another file). – Paul Rousseau Oct 19 '18 at 09:47

1 Answer


With the help of @zwer, I got this to work:

def nested(args):
    try:
        f(args)
    except:
        nested(args)

for worker in workers:
    proc = multiprocessing.Process(target=nested, args=(args[worker],))
    proc.start()

Just beware that this structure only works as-is on Unix-like systems; on Windows, which uses the "spawn" start method, the target function must live in a separate module so it can be pickled.
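Two further caveats with this pattern: the recursive restart will eventually hit Python's recursion limit if `f()` keeps failing on the same input, and a bare `except:` also swallows `KeyboardInterrupt`. A minimal sketch of a safer variant, restarting in a bounded loop instead of recursing (the failing `f()` and the retry count here are hypothetical stand-ins, not part of the original question):

```python
import multiprocessing

def f(x):
    # Hypothetical stand-in for the real worker: fails on odd inputs.
    if x % 2:
        raise ValueError("bad input: {}".format(x))
    return x * x

def supervised(x, max_retries=3):
    # Proxy target: retry f() in a bounded loop rather than recursing,
    # so a persistently failing input cannot overflow the stack.
    for _ in range(max_retries):
        try:
            return f(x)
        except Exception:
            continue
    return None  # give up after max_retries failures

if __name__ == "__main__":
    # Both functions are defined at module level, so they are picklable
    # and this also works with the Windows "spawn" start method.
    procs = [multiprocessing.Process(target=supervised, args=(x,))
             for x in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

Catching `Exception` rather than using a bare `except:` lets `KeyboardInterrupt` and `SystemExit` still terminate the child.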
