
The program below does the following things:

  1. Parent process creates an inter-process shared value of data type SHARED_DTYPE.
  2. Parent process creates an inter-process queue to pass an object from the child process to the parent process.
  3. Parent process spawns the child process (and waits for the object to arrive via the inter-process queue).
  4. Child process modifies the value of the inter-process shared value.
  5. Child process creates an object of data type TRAVELLER_DTYPE.
  6. Child process passes the created object via the inter-process queue.
  7. Parent process receives the object via the inter-process queue.

from multiprocessing import Value, Process, Queue
import ctypes

SHARED_DTYPE = ctypes.c_int
TRAVELLER_DTYPE = ctypes.c_float

shared_value = Value(SHARED_DTYPE, 0)  # step 1: create the inter-process shared value
print('type of shared_value =', type(shared_value))
print('shared_value =', shared_value.value)

def child_proc():
    try:
        shared_value.value = 1    # step 4: modify the shared value
        obj = TRAVELLER_DTYPE(5)  # step 5: create the traveller object
        print('send into queue =', obj)
        q.put(obj)                # step 6: pass it via the inter-process queue
    except BaseException as e:
        print(e)
    finally:
        print('child_proc process is finished')

if __name__ == "__main__":
    try:
        q = Queue()                      # step 2: create the inter-process queue
        cp = Process(target=child_proc)  # step 3: spawn the child process
        cp.start()
        cp.join()

        print('shared_value =', shared_value.value)
        obj = q.get()                    # step 7: receive the object from the queue
        print('recv from queue =', obj)
    except BaseException as e:
        print(e)
    finally:
        print('__main__ process is finished')

Now, if the above program is run, it works correctly, giving the following output:

type of shared_value = <class 'multiprocessing.sharedctypes.Synchronized'>
shared_value = 0
send into queue = c_float(5.0)
child_proc process is finished
shared_value = 1
recv from queue = c_float(5.0)
__main__ process is finished

But if we change TRAVELLER_DTYPE to ctypes.c_int at the top of the program, it no longer works correctly.

Sometimes, it gives the following output:

type of shared_value = <class 'multiprocessing.sharedctypes.Synchronized'>
shared_value = 0
send into queue = c_int(5)
child_proc process is finished
shared_value = 1
^C                               <-- Pressed ctrl-C here, was hung indefinitely.
__main__ process is finished

At other times, it gives this output:

type of shared_value = <class 'multiprocessing.sharedctypes.Synchronized'>
shared_value = 0
send into queue = c_int(5)
child_proc process is finished
Traceback (most recent call last):
  File "/usr/lib/python3.8/multiprocessing/queues.py", line 239, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "/usr/lib/python3.8/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
  File "/usr/lib/python3.8/multiprocessing/sharedctypes.py", line 129, in reduce_ctype
    assert_spawning(obj)
  File "/usr/lib/python3.8/multiprocessing/context.py", line 359, in assert_spawning
    raise RuntimeError(
RuntimeError: c_int objects should only be shared between processes through inheritance
shared_value = 1
^C                            <-- Pressed ctrl-C here, was hung indefinitely.
__main__ process is finished

Why?

In general, the program works correctly if and only if SHARED_DTYPE != TRAVELLER_DTYPE

Is some explicit locking object required?


The Python multiprocessing doc page does not mention any such issue.

Searching online for the error message does not give any relevant information or leads.

1 Answer


Odd that it works when the two types are not the same but fails when they are the same. The bug report mentioned looks relevant but old. This does appear to be a bug. A workaround is that, unlike Value objects, objects passed through a Queue do not need to (and perhaps shouldn't) be ctypes types, so you can use plain int and float instead and it works.
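For illustration, here is a minimal sketch of that workaround applied to the original program (assuming Linux/fork, so the module-level q and shared_value globals are still inherited by the child): only the traveller type changes, and a plain Python int crosses the queue instead of a ctypes instance.

SHARED_DTYPE = ctypes.c_int
TRAVELLER_DTYPE = int              # was ctypes.c_int; a plain int is pickled normally by the queue

def child_proc():
    shared_value.value = 1         # the shared Value still uses the ctypes type
    obj = TRAVELLER_DTYPE(5)       # plain Python int, not a ctypes instance
    print('send into queue =', obj)
    q.put(obj)                     # no ctypes object ever enters the queue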

I assume you are running on Linux; Windows uses spawning rather than forking of processes, and with spawning the script is imported into each child process, making global variables different instances between processes. This makes even your "working" scenario fail on Windows. Instead, the queue and shared value should be passed as arguments to the child worker, ensuring they are inherited correctly as the same objects (this may be what the error message is referring to).

Below I've also rearranged the code to work with spawning so it will work on Windows as well as Linux:

from multiprocessing import Value, Process, Queue
import ctypes

SHARED_DTYPE = ctypes.c_int
TRAVELLER_DTYPE = int

def child_proc(q, shared_value):
    shared_value.value = 1
    obj = TRAVELLER_DTYPE(5)
    print('send into queue =', obj)
    q.put(obj)
    print('child_proc process is finished')

if __name__ == "__main__":
    shared_value = Value(SHARED_DTYPE, 0)
    print('type of shared_value =', type(shared_value))
    print('shared_value =', shared_value.value)
    q = Queue()
    cp = Process(target=child_proc, args=(q, shared_value))
    cp.start()
    cp.join()

    print('shared_value =', shared_value.value)
    obj = q.get()
    print('recv from queue =', obj)
    print('__main__ process is finished')

Output:

type of shared_value = <class 'multiprocessing.sharedctypes.Synchronized'>
shared_value = 0
send into queue = 5
child_proc process is finished
shared_value = 1
recv from queue = 5
__main__ process is finished
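If the parent genuinely needs a ctypes instance on its side, a hedged variation (purely illustrative) is to send the plain value and wrap it only after it has been received:

raw = q.get()                  # a plain Python int comes off the queue
obj = ctypes.c_int(raw)        # wrap it in a ctypes instance only on the receiving side
print('recv from queue =', obj)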
Mark Tolonen
  • Hi @MarkTolonen, thanks for answering. Indeed I am running Linux and the code does not need to support any other platform, but thanks for pointing out the issue. In my actual program the issue occurs with `TRAVELLER_DTYPE = bytearray` and `SHARED_DTYPE = c_int`; currently, changing c_int to c_uint resolves the conflict with bytearray. I would like to investigate this issue more, but unfortunately time does not permit that. – Rishabh poddar Jan 08 '21 at 08:45