I have two separate generators (in reality they are fed by two separate ZeroMQ subscribers).

I would like to consume them from the same event loop.

Something conceptually like this:

import time

def gen_one():
    while True:
        yield 1
        time.sleep(1)

def gen_two():
    while True:
        yield 2
        time.sleep(1.3)

for a in gen_one() or gen_two():
    print a

# would like to see:
# 1
# 2
# 1
# 2
# 1
# 2
# 1
# 1
# 2
# ...

Note that this is in Python 2.7.

Right now I'm obviously getting 1,1,1,1...

I could somehow nest the second iterator into the first one (not looping, but rather checking if there is something to read), but that, at best, would constrain the rate of the inner consumer to the rate of the outer consumer, which is undesirable.

Note that zip() is not a good option: it has the same problem as above, and in addition it forces the two generators to run at the same rate, which they don't share.
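
For illustration, a minimal sketch of that lockstep behaviour with the generators above (on 2.7 the lazy variant is itertools.izip; plain zip() on infinite generators would try to build an entire list and never return):

from itertools import izip  # plain zip() never returns on infinite generators

for a, b in izip(gen_one(), gen_two()):
    print a, b
    # after the first pair, every pair takes ~2.3s: the generators are
    # pulled one after the other, so the 1s and 1.3s sleeps run back to back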

Any suggestions on how to achieve this?

Based on suggestions in comments, something like this might work:

from multiprocessing import Process, Queue
import time

def gen_one(queue):
    while True:
        queue.put(1)
        time.sleep(1)

def gen_two(queue):
    while True:
        queue.put(2)
        time.sleep(1.3)

queue = Queue()
p1 = Process(target=gen_one, args=(queue,))
p2 = Process(target=gen_two, args=(queue,))
p1.start()   # Process.start() returns None, so don't chain it off the constructor
p2.start()

while True:
    a = queue.get()
    print a

Which gets the job done.

Not as direct or elegant as I'd like, but definitely not terrible.

sagism
  • Would you consider upgrading to Python 3.5+? There are recent developments in the language relevant here. – wim Mar 29 '18 at 05:11
  • https://docs.python.org/3/library/asyncio-task.html is what @wim is referring to, I believe. – ndmeiri Mar 29 '18 at 05:16
  • Not easy to upgrade to Python 3.5+ as the code base is quite large, but I would still be interested in seeing a Python 3.5+ solution (it's not clear to me from glancing at asyncio how I would implement this). Thanks. – sagism Mar 29 '18 at 05:28
  • There's an asyncio port for Python2 that [apparently](https://twitter.com/gvanrossum/status/545648986409811968?lang=en) has Guido's blessing. However, since you mentioned zeromq, I would believe your actual use case might be more simply solved just by using a queue. Unless you have some particular reason you need to do this all in the main thread, I would just go with plain old threading - despite how unhip that is, it should all work fine on Python 2 for IO bound stuff. – wim Mar 29 '18 at 05:39
  • 1
    Could all generators simply push their products in one joint queue, then the main thread just fetch whatever in queue? – Blownhither Ma Mar 29 '18 at 06:03
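
A minimal sketch of the asyncio approach mentioned in the comments (Python 3.5+ only; merging both sources through an asyncio.Queue is my illustration, not something stated in the thread):

import asyncio

q = asyncio.Queue()

async def gen_one():
    while True:
        await q.put(1)
        await asyncio.sleep(1)

async def gen_two():
    while True:
        await q.put(2)
        await asyncio.sleep(1.3)

async def main():
    # schedule both producers on the same event loop
    asyncio.ensure_future(gen_one())
    asyncio.ensure_future(gen_two())
    while True:
        print(await q.get())   # whichever producer puts first is printed first

asyncio.get_event_loop().run_until_complete(main())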

1 Answer


Could all generators simply push their products into one joint queue, and the main thread just fetch whatever is in the queue?

import time
import threading
import queue    # on Python 2.7 the module is named Queue

q = queue.Queue()
def gen_one():
    while True:
        # was: yield 1
        q.put(1)
        time.sleep(1)

def gen_two():
    while True:
        # was: yield 2
        q.put(2)
        time.sleep(1.3)

def main():
    while True:
        x = q.get(block=True)    # block until either producer puts an item
        print(x)

# daemon threads die with the program; the daemon= keyword needs Python 3.3+,
# on 2.7 create the thread first and set t.daemon = True before t.start()
threading.Thread(target=gen_two, daemon=True).start()
threading.Thread(target=gen_one, daemon=True).start()
m = threading.Thread(target=main, daemon=True)
m.start()
m.join()

The output would be like 2 1 1 2 1 2 1 2 1 1 2 1.

You can wrap your 'yielding' functions into 'putting' functions this way. If you care about the source of each product, add a tag when putting the object into the queue.
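
A minimal sketch of the tagging idea, reusing the queue above (the tag strings are illustrative):

def gen_one():
    while True:
        q.put(('one', 1))    # tag each item with its source
        time.sleep(1)

def main():
    while True:
        source, value = q.get()
        print(source, value)   # e.g. one 1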

Blownhither Ma
  • Similar to the one I added in my edit to the original question. And yes, this is a decent solution. Your solution uses `threading` vs. mine, which uses `multiprocessing`. See https://stackoverflow.com/questions/3044580/multiprocessing-vs-threading-python for the differences. – sagism Mar 29 '18 at 08:06