
I am not totally clear on how closures work in terms of thread safety in Python, especially with regard to generators. For example, suppose I call a function from multiple threads (or multiple processes via multiprocessing) that looks something like this:

def count_to_ten():
    def counter_gen():
        for i in range(10):
            yield i
    return [i for i in counter_gen()]

Will the lists of numbers returned to my threads all be [0, 1, 2, 3, ..., 9], or will there be race conditions, so that different threads end up with different results?
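To be concrete, the calling pattern I have in mind is roughly the following (a minimal sketch; the executor and the worker count are just placeholders for however the threads actually get spawned):

from concurrent.futures import ThreadPoolExecutor

def count_to_ten():
    def counter_gen():
        for i in range(10):
            yield i
    return [i for i in counter_gen()]

# Each thread calls count_to_ten() independently and keeps its own list.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(count_to_ten) for _ in range(4)]
    results = [f.result() for f in futures]

print(results)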

Clearly the following will not be thread-safe:

def counter_gen():
    for i in range(10):
        yield i

But if it is wrapped up in a function and defined as a closure (like the first code snippet), is it thread-safe?

MaxStrange
  • Why would the second version be unsafe? As long as you don't share the returned generators, it should be perfectly fine. – user2357112 Mar 23 '17 at 23:55
  • Calls to `counter_gen` from different threads won't share local variables or anything like that. Each generator gets its own `i`. – user2357112 Mar 23 '17 at 23:58
  • @user2357112 https://stackoverflow.com/questions/1131430/are-generators-threadsafe . I don't want to assign the generator method to a local variable and then call it like that; I want to call a function from multiple threads, and that function will have a generator closure. – MaxStrange Mar 24 '17 at 00:19
  • The guy in that link is sharing the generator iterator - as in, the thing the generator function returns. As long as you're only sharing the function and not the returned iterators, you're fine. – user2357112 Mar 24 '17 at 01:00
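
To make the distinction in the comments above concrete, here is a minimal sketch (the thread count and helper names are arbitrary) showing that each call to the generator function produces an independent iterator, while a single shared iterator is consumed only once in total:

import threading

def counter_gen():
    for i in range(10):
        yield i

def consume_own_iterator(results, idx):
    # Each call to counter_gen() creates a fresh generator iterator with
    # its own frame and its own i, so every thread sees the full 0..9.
    results[idx] = [i for i in counter_gen()]

results = [None] * 4
threads = [threading.Thread(target=consume_own_iterator, args=(results, n))
           for n in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # four independent [0, 1, ..., 9] lists

# A single iterator shared between consumers behaves differently: its values
# are split between whoever calls next() on it, and concurrent next() calls
# can raise "ValueError: generator already executing" (the situation in the
# linked question).
shared = counter_gen()
first_half = [next(shared) for _ in range(5)]
second_half = list(shared)
print(first_half, second_half)  # [0, 1, 2, 3, 4] [5, 6, 7, 8, 9]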

0 Answers