
I am trying to start several processes from within a class, where the processes should share a queue:

import multiprocessing
import queue

class MyMulti:
    def __init__(self):
        self.myq = queue.Queue()

    def printhello(self):
        print("hello")
        self.myq.put("hello")

    def run(self):
        for _ in range(5):
            p = multiprocessing.Process(target=self.printhello)
            p.start()

if __name__ == "__main__":
    multiprocessing.freeze_support()
    m = MyMulti()
    m.run()
    # at that point the queue is being filled in with five elements

This crashes with

C:\Python34\python.exe C:/Users/yop/dev/GetNessusScans/tests/testm.py
Traceback (most recent call last):
  File "C:/Users/yop/dev/GetNessusScans/tests/testm.py", line 20, in <module>
    m.run()
  File "C:/Users/yop/dev/GetNessusScans/tests/testm.py", line 15, in run
    p.start()
  File "C:\Python34\lib\multiprocessing\process.py", line 105, in start
    self._popen = self._Popen(self)
  File "C:\Python34\lib\multiprocessing\context.py", line 212, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Python34\lib\multiprocessing\context.py", line 313, in _Popen
    return Popen(process_obj)
  File "C:\Python34\lib\multiprocessing\popen_spawn_win32.py", line 66, in __init__
    reduction.dump(process_obj, to_child)
  File "C:\Python34\lib\multiprocessing\reduction.py", line 59, in dump
    ForkingPickler(file, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <class '_thread.lock'>: attribute lookup lock on _thread failed

An answer to a similar question suggested using a top-level worker function, which I adapted to my case as

import multiprocessing
import queue

def work(foo):
    foo.printhello()

class MyMulti:
    def __init__(self):
        self.myq = queue.Queue()

    def printhello(self):
        print("hello")
        self.myq.put("hello")

    def run(self):
        for _ in range(5):
            p = multiprocessing.Process(target=work, args=(self,))
            p.start()

if __name__ == "__main__":
    multiprocessing.freeze_support()
    m = MyMulti()
    m.run()
    # at that point the queue is being filled in with five elements

This crashes the same way, though, since passing self as an argument still requires pickling the queue.Queue it contains.

Is there a way to start processes with methods as targets?


1 Answer


I should have used self.myq = multiprocessing.Queue() instead of queue.Queue().

Unlike queue.Queue(), multiprocessing.Queue() is process-safe and can be shared with child processes. A queue.Queue holds a _thread.lock internally, which cannot be pickled when the process is spawned on Windows, hence the PicklingError.

I am leaving the question open for now, in case someone wants to comment on whether the whole approach is wrong.
