
I'm making a class that subclasses multiprocessing.Process. When doing some testing, I noticed that the process was not starting when start() was called. After some testing, it appears that the processes do not start until 2 lines of code are executed in the __main__ module.

As an example:

import multiprocessing

class Test(multiprocessing.Process):
    def __init__(self, *args, **kwargs):
        super(Test, self).__init__(*args, **kwargs)
        print('created')

    def run(self, *args, **kwargs):
        super(Test, self).run(*args, **kwargs)
        print('running')

>>> sample = Test()
created
>>> sample.start()
>>> pass # Did not start yet
>>> pass # Will start after this
running

I've tested this on other platforms, where it works as expected. I'm running Windows 10 with Python 3.5.2; sys.version reports '3.5.2 |Continuum Analytics, Inc.| (default, Jul 5 2016, 11:41:13) [MSC v.1900 64 bit (AMD64)]'

  • Are you always running from a python interactive session? Do you get the same when executing the code as a script from the command-line? – cdarke Feb 10 '17 at 14:44
  • It always appears to be the same, whether it's from an interactive session, a script, or importing a module. – DeepHorizons Feb 10 '17 at 14:49

1 Answer


It is known that multiprocessing behaves differently on Windows and Linux. By "other platforms", do you mean Linux? On Linux, multiprocessing forks (with fork()) a new process, and the child gets a copy of all the variables and state it needs; on Windows, a fresh copy of the Python interpreter is spawned for each process instead. There is a good answer explaining it here: Python Multiprocess diff between Windows and Linux
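A consequence of the spawn behaviour can be seen in a small sketch (not from the original post): the spawned child re-imports the main module, so process creation must be guarded by the __main__ check, and join() is what guarantees the child has actually run:

```python
import multiprocessing

class Test(multiprocessing.Process):
    def run(self):
        print('running')

if __name__ == '__main__':
    # Required on Windows: the spawned child re-imports this module,
    # and unguarded top-level code would recursively create processes.
    sample = Test()
    sample.start()
    sample.join()  # block until the child has actually run
```

Without the join(), the parent can reach later statements before the child gets scheduled, which looks exactly like a "delayed" start.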

So, to remedy this, make sure that any state the child needs is explicitly passed to it, and use the shared-memory primitives for data that must be shared between processes. See the multiprocessing docs for more information: https://docs.python.org/dev/library/multiprocessing.html#sharing-state-between-processes
