
I have an issue reading a multiprocessing queue when the function for reading the queue is called from another module.

Below is the class containing the function that starts a process running `function_to_get_data`. The class resides in its own file, which I will call one.py. `function_to_get_data` is in another file, two.py, and is an infinite loop that puts data into the queue (code snippet for this further down). one.py also contains the function to read the queue. The Queue `q` is defined globally at the top of one.py.

import multiprocessing
from two import function_to_get_data

q = multiprocessing.Queue()

class Poller:
    def startPoller(self):
        # start the producer from two.py in its own process, handing it the queue
        pollerThread = multiprocessing.Process(target=function_to_get_data, args=(q,))
        pollerThread.start()

    def getPoller(self):
        if q.empty():
            print "queue is empty"
        else:
            # peek: take the item off and put it straight back
            pollResQueue = q.get()
            q.put(pollResQueue)
            return pollResQueue

if __name__ == "__main__":
    startpoll = Poller()
    startpoll.startPoller()
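
An aside on getPoller: checking `q.empty()` and then calling `q.get()` is racy across processes, since the queue can change between the two calls. Below is a sketch of the same read done with a blocking `get` and a timeout (multiprocessing queues raise the `Queue` module's `Empty` exception on timeout); this is an alternative, not the original code:

from Queue import Empty   # the exception multiprocessing queues raise

def getPoller(self):
    try:
        # block for up to a second instead of polling empty() first
        pollResQueue = q.get(timeout=1)
    except Empty:
        print "queue is empty"
    else:
        q.put(pollResQueue)   # put the item back so later callers still see it
        return pollResQueue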

Below is a snippet from two.py showing `function_to_get_data`:

def function_to_get_data(q):
    while True:
        # performs actions #
        q.put(data_from_actions)   # push each result onto the shared queue

I have another module, three.py, which needs the data from the queue and requests it by calling the function on the class above:

from one import Poller
externalPoller = Poller()
data_this_module_needs = externalPoller.getPoller()

The issue is that, when read from three.py, the Queue is always empty.

I should add that the function in three.py is also started as a separate process from one.py by a POST from a web page:

def POST(self):
    data = web.input()
    if data == 'Start':
        thread_two = multiprocessing.Process(target=function_in_three_py, args=(q,))
        thread_two.start()
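
Since the POST handler already passes `q` via `args=(q,)`, the function in three.py could read the queue it is handed instead of going through `one.Poller`. A sketch, assuming a signature for `function_in_three_py` that isn't shown in the original:

def function_in_three_py(q):
    # hypothetical: use the queue passed in by the POST handler directly,
    # rather than the module-level q created by importing one.py
    data_this_module_needs = q.get()   # blocks until the poller puts something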

If I run the two Poller functions from the Python command line, I get data from the queue with no problem.
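
That test working makes sense: in a single interpreter session there is only one `q`, and on Linux the process started by `startPoller` inherits it via fork, so producer and reader share the same queue. Roughly:

>>> from one import Poller    # one q exists in this interpreter
>>> p = Poller()
>>> p.startPoller()           # the forked child inherits that same q
>>> p.getPoller()             # reads what the child put on it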

  • Can you explain how you're testing this? Are you just executing three.py? Because when you do that, `startPoller` isn't getting called, so nothing is ever going to get put into the `Queue`. If you're executing `one.py` separately from `three.py`, each script is going to have its own separate copy of the `Queue`. `multiprocessing` is generally meant to be used in the context of one parent script. If you want to share a `Queue` between two separate scripts, you need to use a remote `multiprocessing.Manager`. – dano Jun 04 '15 at 17:07
  • Also, you use both `function_to_get_data` and `module_to_get_data` in your example, where you probably just want one or the other. And the `Process` constructor is taking `arg(q,)`, where you probably meant `args=(q,)`. – dano Jun 04 '15 at 17:08
  • added further info on how `three.py` is run. `one.py` is run initially, which kicks off `two.py`; `three.py` is called at an arbitrary time from a web POST. dano, I think what you are describing regarding `multiprocessing.Manager` might be the key. – StuFish Jun 04 '15 at 17:15
  • What OS are you running this on? – dano Jun 04 '15 at 17:17
  • I'm running on Ubuntu – StuFish Jun 04 '15 at 18:04
  • looks like this might be what I need [here](http://stackoverflow.com/questions/1829116/how-to-share-variables-across-scripts-in-python/14700365#14700365) – StuFish Jun 04 '15 at 19:47
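
Following dano's `Manager` suggestion and the linked answer: since one.py and three.py run as separate interpreters, the queue has to be served by a manager that both can connect to over a socket. A rough sketch using `multiprocessing.managers.BaseManager` (the port and authkey are placeholders):

# one.py (server side): create the queue behind a manager
from multiprocessing.managers import BaseManager
from Queue import Queue

shared_q = Queue()

class QueueManager(BaseManager):
    pass

QueueManager.register('get_queue', callable=lambda: shared_q)
manager = QueueManager(address=('', 50000), authkey='abracadabra')
manager.start()              # serves the queue from a background process
q = manager.get_queue()      # a proxy, safe to pass to child processes

# three.py (client side): connect to the running manager
from multiprocessing.managers import BaseManager

class QueueManager(BaseManager):
    pass

QueueManager.register('get_queue')
manager = QueueManager(address=('localhost', 50000), authkey='abracadabra')
manager.connect()
q = manager.get_queue()            # the same queue one.py is filling
data_this_module_needs = q.get()

With this, one.py's module-level `q = multiprocessing.Queue()` would be replaced by the manager proxy, and `startPoller` can hand it to the producer via `args=(q,)` exactly as before.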

0 Answers