I have a queue processed in a separate thread. When I fill jobs from the main thread, only the last job is processed, multiple times. It works if I block the queue with a join() between each put(), but that doesn't fit my needs. In Python 3:
    from queue import PriorityQueue
    from threading import Thread

    Q = PriorityQueue()

    def processQ():
        while True:
            if not Q.empty():
                job = Q.get(False)
                job[1]()
                Q.task_done()

    def fill():
        for i in range(100):
            Q.put(((1, Q.qsize()), lambda: print(i)))

    def main():
        Thread(target=processQ).start()
        fill()
And the output is :
99
99
99
99
...and so on, 100 times in total.
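A minimal reproduction (no queue, no thread; the names are my own) shows the same symptom, so the queue itself seems innocent:

```python
# Build 5 lambdas in a loop, then call them after the loop is done.
# Each lambda closes over the *variable* i, not its value at creation
# time, so every call sees the final value of i.
funcs = []
for i in range(5):
    funcs.append(lambda: i)

results = [f() for f in funcs]
print(results)  # → [4, 4, 4, 4, 4]
```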
I've read that this can be solved with multiprocessing, but that seems very complicated for the simple behaviour I want...
Another thing I don't understand is why I have to include Q.qsize() in the tuple I put; otherwise a

    TypeError: unorderable types: function() < function()

is raised. I didn't have to do this in Python 2.7.
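A minimal sketch of that TypeError, assuming two entries tie on priority: Python 3 compares tuples element by element, and when the first elements are equal it falls through to comparing the functions themselves, which have no ordering (Python 2 allowed ordering arbitrary types):

```python
def a(): pass
def b(): pass

# Equal first elements force Python 3 to compare the second elements,
# and functions are unorderable (newer versions word the message as
# "'<' not supported between instances of ...").
try:
    (1, a) < (1, b)
except TypeError as e:
    print(e)

# A unique tie-breaker (like Q.qsize() above) keeps the comparison
# from ever reaching the functions:
print((1, 0, a) < (1, 1, b))  # → True
```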
I would be very happy if you could help me.
****** EDIT ******
So it seems you CANNOT use a lambda the way I did. You HAVE to put the function and its arguments into the queue as a tuple, like this:

    for i in range(100):
        Q.put(((1, Q.qsize()), (print, i)))
    def processQ():
        while True:
            if not Q.empty():
                job = Q.get(False)
                func = job[1][0]   # job[0] is the priority, job[1][0] is the function
                args = job[1][1:]  # job[1][1:] are the arguments
                func(*args)
                Q.task_done()
THE question now is: WHY?
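For the record, my understanding is that a lambda does work if the loop variable is bound as a default argument at creation time; a minimal sketch (my own names, not the full program):

```python
# Default arguments are evaluated when the lambda is *created*, so each
# lambda keeps its own copy of i instead of sharing the loop variable.
funcs = []
for i in range(5):
    funcs.append(lambda i=i: i)

print([f() for f in funcs])  # → [0, 1, 2, 3, 4]
```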