
Is it possible to share an object of my own class between multiprocessing.Process instances in Python? I found a few ideas, but it seems they don't work for my types.

Here are my imports:

import multiprocessing as mp
import sharedmem as shm
import random

I want to share an object of my class between processes; let's say it's a MyVector class:

class MyVector:
    def __init__(self,x_1,x_2):
        self.x_1 = x_1
        self.x_2 = x_2

    def set_vector(self, x_1, x_2):
        self.x_1 = x_1
        self.x_2 = x_2

    def get_vector(self):
        return self.x_1, self.x_2

So far it looks simple. The next step is to create a sharedmem array of the proper size and dtype:

v_size = 5
shared = shm.empty(v_size, dtype=MyVector)
print("shared", shared)

And my output is:

shared [None None None None None]

This looks strange to me, because when I change the type to float, print gives me a result like this:

shared [ 0.  0.  0.  0.  0.]
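For comparison, plain NumPy behaves the same way (a sketch assuming sharedmem follows NumPy's dtype handling, where non-numeric element types collapse to dtype=object): an empty object array is filled with None references rather than zeroed values.

```python
import numpy as np

# an object-dtype array holds references, so np.empty
# initializes every slot to None instead of zeroing bytes
a = np.empty(5, dtype=object)
print(a)  # [None None None None None]

# a float array is a plain numeric buffer; np.empty leaves it
# uninitialized, which in practice often shows up as zeros
b = np.empty(5, dtype=float)
print(b)
```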

Let's go further. Now I want to add MyVectors to shared and print the objects:

min_x = 0
max_x = 1
for i in range(v_size):
    x_1 = random.uniform(min_x, max_x)
    x_2 = random.uniform(min_x, max_x)
    shared[i] = MyVector(x_1, x_2)
for i in range(v_size):
    print(shared[i].get_vector())

An example result looks like this:

(0.09967776182991428, 0.5409857393838231)
(0.6722157278118476, 0.7321068889697359)
(0.8677334456416979, 0.009142982318455117)
(0.6627846159441471, 0.1627625183127126)
(0.08099900459563925, 0.5904205522643091)

So that generally looks ok. Let's focus on the processes. I want to create v_size processes, pass shared to some_function(), and check the result:

def some_function(shared, idx, x_1, x_2):
    print(shared[idx])
    shared[idx] = MyVector(x_1, x_2)
    print(shared[idx])

processes = [mp.Process(target=some_function, args=(shared, i, 0, 0)) for i in range(v_size)]
print(processes)
for process in processes:
    process.start()
for process in processes:
    process.join()

My idea was to change every MyVector object to (0, 0), but it seems that inside some_function() the object at position idx is None. When I try to create a MyVector inside some_function() and assign it at position idx, the change is visible only inside some_function(), so it seems that every process I start gets a copy of the array (or a copy of some None objects?).
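That "visible only inside the function" behaviour can be reproduced without sharedmem at all (a minimal sketch with plain NumPy and multiprocessing; the mutate helper is made up for illustration): the child process receives a pickled copy of an object-dtype array, so its writes never reach the parent.

```python
import multiprocessing as mp
import numpy as np

def mutate(arr, idx):
    # this assignment only changes the child's pickled copy
    arr[idx] = 'changed'
    print('inside child:', arr[idx])  # 'changed'

if __name__ == '__main__':
    arr = np.empty(3, dtype=object)
    arr[:] = ['a', 'b', 'c']
    p = mp.Process(target=mutate, args=(arr, 0))
    p.start()
    p.join()
    print('in parent:', arr[0])  # still 'a'
```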

What is also strange: I can't see v_size processes, but only 1.

Do you have any idea how I can achieve that?

Tatarinho
  • From quickly looking at the [`sharedmem` documentation](http://rainwoodman.github.io/sharedmem/) I get the impression that the package manages its own pool of workers. What gave you the idea it can be used with `multiprocessing`? – MB-F Nov 16 '17 at 08:40
  • I found something similar at https://stackoverflow.com/questions/17785275/share-large-read-only-numpy-array-between-multiprocessing-processes – Tatarinho Nov 16 '17 at 08:46
  • I see. That answer is quite old... could be that something changed in the meantime. But I rather think the problem is that you are trying to share an array of Python objects. I don't know for sure, but this sounds like it could cause trouble. The array does not contain the objects themselves but only references. So while the references are shared, the objects are not. Does that make sense? – MB-F Nov 16 '17 at 08:59
  • Maybe. I don't understand why inside the Process it's None, not even a copied reference. Do you have any idea how I can achieve object sharing between processes? – Tatarinho Nov 16 '17 at 09:38
  • I have no idea how to share Python objects. It may be impossible because a Python object is under the hood a jumble of C pointers which have no meaning across processes. – MB-F Nov 16 '17 at 10:27
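Following up on the comments: since only references would live in a shared object array, one workaround (a sketch using plain multiprocessing and NumPy instead of sharedmem; the vec_dtype and as_vectors names are made up for illustration) is to share the numeric fields themselves through a structured dtype backed by a multiprocessing.Array. Each slot then holds the two floats directly in shared memory, so child writes are visible to the parent.

```python
import multiprocessing as mp
import numpy as np

# one record per vector: two 8-byte floats living directly in shared memory
vec_dtype = np.dtype([('x_1', 'f8'), ('x_2', 'f8')])
v_size = 5

# shared, zero-initialized byte buffer big enough for v_size records
raw = mp.Array('b', v_size * vec_dtype.itemsize)

def as_vectors(raw_arr):
    """View the shared buffer as an array of (x_1, x_2) records."""
    return np.frombuffer(raw_arr.get_obj(), dtype=vec_dtype)

def some_function(raw_arr, idx, x_1, x_2):
    shared = as_vectors(raw_arr)
    shared[idx] = (x_1, x_2)  # the write lands in shared memory

if __name__ == '__main__':
    processes = [mp.Process(target=some_function, args=(raw, i, 0.0, 0.0))
                 for i in range(v_size)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()
    print(as_vectors(raw))  # the parent sees the children's writes
```

The trade-off is that only the data fields are shared, not the MyVector objects themselves; you would rebuild a MyVector from a record's fields wherever you need the methods.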

0 Answers