I want to create two modules (call them A and B): A writes values to a list, and B reads those values from that list. Module A runs constantly and is implemented using multiprocessing.
My idea was to create a third module C that is imported by both A and B. A then stores values in a list defined in C, and B accesses that list whenever it needs to.
The main reason for having that shared list is that I don't want A to talk to B directly.
This is an example of the code I have so far:
#a.py
import c
from multiprocessing import Process

def foo():
    while True:
        c.add('value')

def run():
    p = Process(target=foo)
    p.start()
    p.join()

if __name__ == '__main__':
    run()
#b.py
import c
print(c.list_of_values)
#c.py
list_of_values = []

def add(val):
    list_of_values.append(val)
As far as I understand, the problem is that each process ends up with a completely new instance of module C when it imports it. Can I somehow make it shared? The third module is not necessary if I can avoid it by keeping list_of_values in A with some way for B to retrieve it. I tried that but got nowhere. I can have B depend on A, so B could import A and run it. Any hints?
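To show what I mean by "a completely new instance", here is a little test I put together (the names are mine, not from my real modules) demonstrating that a child process appending to a module-level list does not change the parent's copy:

```python
from multiprocessing import Process

shared = []  # module-level list; each child process works on its own copy

def worker():
    # this mutates only the child's copy of the list
    shared.append('from child')

def demo():
    p = Process(target=worker)
    p.start()
    p.join()
    return shared  # the parent's list is still empty

if __name__ == '__main__':
    print(demo())
```

This is exactly the behaviour I am running into: the parent's `shared` stays `[]` no matter what the child does.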
Something like the following would also work for me:
#b.py
import a
from multiprocessing import Process

Process(target=a.run).start()
print(a.list_of_values)
I think my problem with that was that the Process runs in its own address space, so it does not affect the parent's global variables. I then got lost with multiprocessing's Managers and Queues :)
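From what I gathered in the docs, a Manager-backed list seems close to what I want. Here is a rough sketch of how I imagine it would work (names are mine, and I may well be misusing the API):

```python
from multiprocessing import Process, Manager

def foo(values):
    # the child appends through the list proxy, so the parent sees the updates
    for i in range(3):
        values.append('value-%d' % i)

def run():
    with Manager() as manager:
        values = manager.list()  # a list proxy shared between processes
        p = Process(target=foo, args=(values,))
        p.start()
        p.join()
        return list(values)  # copy out before the manager shuts down

if __name__ == '__main__':
    print(run())
```

But I don't see how to fit this into my A/B/C module layout, since the Manager has to be created somewhere and both sides need a handle to the same proxy.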