
Below is my Python script.

import multiprocessing
# We must import this explicitly, it is not imported by the top-level
# multiprocessing module.
import multiprocessing.pool
import time

from random import randint


class NoDaemonProcess(multiprocessing.Process):
    # make 'daemon' attribute always return False
    def _get_daemon(self):
        return False
    def _set_daemon(self, value):
        pass
    daemon = property(_get_daemon, _set_daemon)

# We sub-class multiprocessing.pool.Pool instead of multiprocessing.Pool
# because the latter is only a wrapper function, not a proper class.
class MyPool(multiprocessing.pool.Pool):
    Process = NoDaemonProcess

def sleepawhile(t):
    print("Sleeping %i seconds..." % t)
    time.sleep(t)
    return t

def work(num_procs):
    print("Creating %i (daemon) workers and jobs in child." % num_procs)
    pool = multiprocessing.Pool(num_procs)

    result = pool.map(sleepawhile,
        [randint(1, 5) for x in range(num_procs)])

    # The following is not really needed, since the (daemon) workers of the
    # child's pool are killed when the child is terminated, but it's good
    # practice to cleanup after ourselves anyway.
    pool.close()
    pool.join()
    return result

def test():
    print("Creating 5 (non-daemon) workers and jobs in main process.")
    pool = MyPool(20)

    result = pool.map(work, [randint(1, 5) for x in range(5)])

    pool.close()
    pool.join()
    print(result)

if __name__ == '__main__':
    test()

This is running on an Ubuntu server and I'm using Python 3.6.7.

This was working properly, but after an apt-get upgrade I'm getting the error:

group argument must be None for now

What might be causing this error? Should I change the Python version, or should I roll back the changes from the upgrade?

EDIT 1

Stack trace of the exception:

Traceback (most recent call last):
  File "/src/mainapp.py", line 104, in bulkfun
    p = MyPool(20)
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 175, in __init__
    self._repopulate_pool()
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 236, in _repopulate_pool
    self._wrap_exception)
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 250, in _repopulate_pool_static
    wrap_exception)
  File "/usr/lib/python3.6/multiprocessing/process.py", line 73, in __init__
    assert group is None, 'group argument must be None for now'
AssertionError: group argument must be None for now

EDIT 2

The code works on Python 2.7 and Python 3.5, but if I run it with Python 3.6.7 I get the error below.

Creating 5 (non-daemon) workers and jobs in main process.
Traceback (most recent call last):
  File "multi.py", line 52, in <module>
    test()
  File "multi.py", line 43, in test
    pool = MyPool(5)
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 175, in __init__
    self._repopulate_pool()
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 236, in _repopulate_pool
    self._wrap_exception)
  File "/usr/lib/python3.6/multiprocessing/pool.py", line 250, in _repopulate_pool_static
    wrap_exception)
  File "/usr/lib/python3.6/multiprocessing/process.py", line 73, in __init__
    assert group is None, 'group argument must be None for now'
AssertionError: group argument must be None for now
– Mohamed Ahmed
  • Please could you paste the stacktrace of the exception. – Gary van der Merwe Oct 23 '18 at 12:45
  • @GaryvanderMerwe See the stacktrace of the exception. – Mohamed Ahmed Oct 24 '18 at 05:08
  • That stacktrace does not match your example - try running the code you've posted as an example and see if you encounter the same problem. On another note, are you really going to run this on a 20 x 1-5 CPU system with workers fully utilizing all CPU resources to justify the multiprocessing overhead and the convoluted nature of it? – zwer Oct 24 '18 at 05:53

2 Answers


I came across this issue while upgrading a Travis CI distribution from 14.04 to 16.04, after which Python 3.6 started to fail. I found the solution in a fix that was made for another package: FIX: Python 2.7-3.7.1 compatible NonDaemonPool.

import multiprocessing
import multiprocessing.pool


class NonDaemonPool(multiprocessing.pool.Pool):
    def Process(self, *args, **kwds):
        proc = super(NonDaemonPool, self).Process(*args, **kwds)

        class NonDaemonProcess(proc.__class__):
            """Monkey-patch process to ensure it is never daemonized"""
            @property
            def daemon(self):
                return False

            @daemon.setter
            def daemon(self, val):
                pass

        proc.__class__ = NonDaemonProcess
        return proc
– Jirka

Same here. The following code worked in my case (Python 3.6.7); see https://stackoverflow.com/a/53180921/10742388.

import multiprocessing
import multiprocessing.pool


class NoDaemonProcess(multiprocessing.Process):
    @property
    def daemon(self):
        return False

    @daemon.setter
    def daemon(self, value):
        pass


class NoDaemonContext(type(multiprocessing.get_context())):
    Process = NoDaemonProcess

# We sub-class multiprocessing.pool.Pool instead of multiprocessing.Pool
# because the latter is only a wrapper function, not a proper class.
class MyPool(multiprocessing.pool.Pool):
    def __init__(self, *args, **kwargs):
        kwargs['context'] = NoDaemonContext()
        super(MyPool, self).__init__(*args, **kwargs)

I think this problem comes from a change in process.py (https://github.com/python/cpython/blob/8ca0fa9d2f4de6e69f0902790432e0ab2f37ba68/Lib/multiprocessing/process.py#L189).
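
For what it's worth, the assertion in the traceback is the long-standing "group is None" check in BaseProcess.__init__; it fires as soon as anything other than None arrives as the first positional argument (group), which seems to be what the newer pool code ends up passing when Process is overridden with a plain class, as in the question. A tiny snippet (just an illustration, unrelated to pools) reproduces the same message:

import multiprocessing

# Any non-None value in the first positional slot ("group") triggers the
# same assertion that appears at the bottom of the traceback above.
try:
    multiprocessing.Process(object(), target=print)
except AssertionError as exc:
    print(exc)  # group argument must be None for now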

– MarshTech