
I'm trying to use multiprocessing in Python to have a function keep getting called within a loop, and subsequently access the latest return value from the function (by storing the values in a LIFO Queue).

Here is a code snippet from the main program

import multiprocessing
import Queue  # Python 2; in Python 3 this module is named queue

q = Queue.LifoQueue()
while True:
    # note: args must be a tuple, hence the trailing comma
    p = multiprocessing.Process(target=myFunc, args=(q,))
    p.daemon = True
    p.start()
    if not q.empty():
        pass  # do something with q.get()

And here's a code snippet from myFunc

def myFunc(q):
    x = calc()
    q.put(x)

The problem is that the main loop always sees q as empty. However, I've verified that myFunc() is placing values into q (by adding a q.empty() check right after q.put(x)), so the queue shouldn't be empty.

What can I do so that the main loop can see the values placed in the queue? Or am I going about this in an inefficient way? (I do need myFunc and the main loop to run separately, though, since myFunc is a bit slow and the main loop needs to keep performing its task.)

user3543300
  • https://stackoverflow.com/questions/33691392/how-to-implement-lifo-for-multiprocessing-queue-in-python – wojtow Oct 16 '21 at 21:52

1 Answer


Queue.LifoQueue is not fit for multiprocessing; only multiprocessing.Queue is, as it is specially designed for this use case. When you spawn a subprocess, each process gets its own copy of a Queue.LifoQueue, so values put into it in the child are only visible to that child process — the queue is not shared between processes.
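For illustration, a minimal sketch of a multiprocessing.Queue actually crossing the process boundary (the function name producer is mine, not from the question; also note that multiprocessing.Queue is FIFO, not LIFO):

```python
import multiprocessing

def producer(q):
    # runs in the child process; the value is visible to the parent
    q.put(42)

if __name__ == "__main__":
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=producer, args=(q,))
    p.start()
    print(q.get())  # blocks until the child has put a value
    p.join()
```

Doing the same with a Queue.LifoQueue would hang on q.get(), because the child's put only affects the child's private copy.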

One possibility would be to use a shared list from a SyncManager (SyncManager.list()) instead. When used with only append and pop, a list behaves just like a LIFO queue.
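A minimal sketch of that idea (the worker function and the values appended are illustrative, not from the question):

```python
import multiprocessing

def worker(shared):
    # runs in a child process; appends are visible to the parent
    for i in range(3):
        shared.append(i * 10)  # appends 0, 10, 20

if __name__ == "__main__":
    with multiprocessing.Manager() as manager:
        shared = manager.list()  # proxy list, shared across processes
        p = multiprocessing.Process(target=worker, args=(shared,))
        p.start()
        p.join()
        # pop() returns the most recently appended value, i.e. LIFO order
        print(shared.pop())
```

In the question's setup, the main loop would check the list's length instead of q.empty() and use pop() instead of q.get(); keep in mind the manager proxy adds some IPC overhead per operation.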

mata
  • Thanks! I will try out SyncManager.list() and see if it works. Do you think for what I'm trying to do though, multiprocessing is the best way to go? – user3543300 Aug 13 '15 at 07:50