Is there any option to have a multiprocessing Queue where each value can be accessed twice?
My problem is that I have one "generator process" creating a constant flux of data, and I would like to access this data in two different processes, each doing its own thing with it.
A minimal example of the issue:
import multiprocessing as mp
import numpy as np

class Process1(mp.Process):
    def __init__(self, Data_Queue):
        mp.Process.__init__(self)
        self.Data_Queue = Data_Queue

    def run(self):
        while True:
            data = self.Data_Queue.get()
            # Do stuff with the data
            self.Data_Queue.task_done()

class Process2(mp.Process):
    def __init__(self, Data_Queue):
        mp.Process.__init__(self)
        self.Data_Queue = Data_Queue

    def run(self):
        while True:
            data = self.Data_Queue.get()
            # Do stuff with the data
            self.Data_Queue.task_done()

if __name__ == "__main__":
    # task_done() only exists on JoinableQueue, not on plain mp.Queue
    data_Queue = mp.JoinableQueue()
    P1 = Process1(data_Queue)
    P1.start()
    P2 = Process2(data_Queue)
    P2.start()
    while True:  # Generate data
        data_Queue.put(np.random.rand(1000))
The idea is that both Process1 and Process2 should see all of the generated data. As written, each item is consumed by whichever process happens to call get() first, so each process only receives a random portion of the stream.
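One workaround I have been considering is to give each process its own queue and have the generator put every item on both, so each consumer sees the full stream. A minimal sketch of that idea (assuming it is acceptable for each process to hold its own copy of the arrays):

import multiprocessing as mp
import numpy as np

def consumer(data_queue):
    while True:
        data = data_queue.get()
        # Do stuff with the data

if __name__ == "__main__":
    queue_1 = mp.Queue()
    queue_2 = mp.Queue()
    mp.Process(target=consumer, args=(queue_1,), daemon=True).start()
    mp.Process(target=consumer, args=(queue_2,), daemon=True).start()
    while True:  # Generate data
        data = np.random.rand(1000)
        # Fan out: every item is put on both queues, so both
        # processes receive the complete stream
        queue_1.put(data)
        queue_2.put(data)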
Thanks for the help!
Update 1: As pointed out in some of the comments and answers, this becomes a little more complicated for two reasons I did not include in the initial question:
- The data is generated externally on a non-constant schedule (I may receive tons of data for a few seconds, then wait minutes for more to come).
- As such, data may arrive faster than it can be processed, so it needs to be queued up somehow while it waits for its turn (see the sketch after this list).
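For what it's worth, my understanding is that an unbounded mp.Queue() already buffers in this sense: put() returns immediately, so a burst simply piles up until the consumer works through its backlog (at the cost of memory). A rough sketch of the behaviour I am hoping for, with an artificial burst and a deliberately slow consumer:

import multiprocessing as mp
import time

def slow_consumer(data_queue):
    while True:
        item = data_queue.get()
        time.sleep(0.1)  # pretend processing is slower than the burst

if __name__ == "__main__":
    q = mp.Queue()  # no maxsize: put() never blocks, items queue up
    mp.Process(target=slow_consumer, args=(q,), daemon=True).start()
    for i in range(100):  # burst: 100 items arrive at once
        q.put(i)
    time.sleep(15)  # the backlog drains at the consumer's own pace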