
(Disclaimer: I'm not expecting someone to tell me exactly what is wrong, just some ways that might make finding it out faster.) I have a very complex function that runs as a separate process (some of the functions it calls are inside C++ extensions), and it hangs instead of exiting. The code looks like this:

import multiprocessing

def process_to_run(queue, some_complex_object):
    do_stuff(some_complex_object)
    queue.put(some_generated_data)

if __name__ == "__main__":
    some_complex_objects = getSomeComplexObjects()  # returns a list of objects
    queue = multiprocessing.Queue()
    processes = []
    for i in range(4):
        processes.append(multiprocessing.Process(target=process_to_run,
                                                 args=(queue, some_complex_objects[i])))
    for process in processes:
        process.start()
    results = []
    for i in range(4):
        results.append(queue.get())
    for process in processes:
        process.join()

The problem is that do_stuff hangs without throwing any exceptions. So far I have tried killing the process with os and with psutil, but neither worked (the process keeps hanging even after I call kill()). I have also tried wrapping it in a try/except with traceback, as suggested here:

def do_stuff():
    try:
        ...  # some code
    except Exception:
        import traceback
        traceback.print_exc()
    print("end")

However, it does not print anything except "end". I have also tried sys.exc_info(), but the only thing I got from that was None.
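
Another thing I could do to narrow it down is to put timeouts on queue.get() and join() in the main script, so it at least reports which step is blocking instead of hanging silently. This is only a sketch and the timeout values are arbitrary:

import queue as stdlib_queue  # multiprocessing.Queue.get raises queue.Empty on timeout

results = []
for i in range(4):
    try:
        results.append(queue.get(timeout=60))  # arbitrary timeout
    except stdlib_queue.Empty:
        print("queue.get() timed out: a worker never produced a result")
        break
for process in processes:
    process.join(timeout=10)  # arbitrary timeout
    if process.is_alive():
        print("worker %d is still alive after join() timed out" % process.pid)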

So my questions are:

  1. Is it possible that the process is hanging without anything raising an exception?
  2. How can I tell what is causing it to hang? (One idea I am considering is sketched below.)
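
For question 2, one idea is to make each worker dump its own traceback after a delay with faulthandler. This is only a sketch, and since faulthandler dumps Python-level frames, a hang inside a C++ extension would only show the last Python line reached before control entered the extension:

import faulthandler

def process_to_run(queue, some_complex_object):
    # If this worker is still running after 120 s (arbitrary), dump the
    # Python traceback of all of its threads to stderr.
    faulthandler.dump_traceback_later(120, exit=False)
    do_stuff(some_complex_object)
    faulthandler.cancel_dump_traceback_later()
    queue.put(some_generated_data)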

Thanks

qwerty_99
  • There are a bunch of good solutions for this [here](https://stackoverflow.com/questions/3443607/how-can-i-tell-where-my-python-script-is-hanging) (one of them, a signal-based stack dump, is sketched after these comments). However, it gets more complicated when your program is large and you are running it as a module. In that case, I would favour prints until you find the issue. You could also be hitting a [deadlock](https://sopython.com/canon/82/programs-using-multiprocessing-hang-deadlock-and-never-complete/) in multiprocessing. – forgetso Aug 04 '20 at 16:39
  • You can also try `p.close()` on each of your processes before you join them. From the [docs](https://docs.python.org/2/library/multiprocessing.html#module-multiprocessing.pool) "One must call close() or terminate() before using join()." – forgetso Aug 04 '20 at 16:43
  • thanks, I'm gonna try that – qwerty_99 Aug 04 '20 at 17:23
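
One of the approaches from the question linked in the first comment is a signal-based stack dump. A minimal sketch, assuming a Unix platform and assuming the hung worker still returns to the interpreter loop to run signal handlers (it may not if it is stuck inside a C++ call that never returns):

import signal
import traceback

def dump_stack(signum, frame):
    # Print the Python stack of whatever was executing when the signal arrived.
    traceback.print_stack(frame)

# Install this at the top of process_to_run in each worker:
signal.signal(signal.SIGUSR1, dump_stack)

# Then, from a shell, ask a hung worker for its stack:
#     kill -USR1 <pid-of-hung-worker>

If nothing prints, the worker is most likely blocked inside native code and never gets back to the interpreter to run the handler, which would also be consistent with kill() appearing to do nothing.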

0 Answers