Because sometimes it's more practical than designing a solution around queues, I would like to write a simple wrapper that makes an iterator thread-safe. So far, I've taken inspiration from these topics and come up with two ideas:
Idea 1
import threading

class LockedIterator(object):
    def __init__(self, it):
        self._lock = threading.Lock()
        self._it = it.__iter__()
        if hasattr(self._it, 'close'):
            # expose close() (e.g. for generators), guarded by the same lock
            def close():
                with self._lock:
                    self._it.close()
            self.__setattr__('close', close)

    def __iter__(self):
        return self

    def next(self):
        with self._lock:
            return self._it.next()
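For reference, here is roughly how I use it. (A minimal sketch, assuming Python 2 to match the next() method above; the numbers generator and worker function are just placeholder names.)

import threading

def numbers():
    for i in range(1000):
        yield i

shared = LockedIterator(numbers())
results = []

def worker():
    # each next() call on the shared generator is serialized by the lock
    for value in shared:
        results.append(value)

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# every number should be consumed exactly once across all threads
assert sorted(results) == list(range(1000))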
What I don't like about this is that it gets a bit lengthy if I have to spell out every possible method (which, of course, I can't), such as the special close() case for generators. Also, any other iterator with more specific methods of its own now has those methods hidden.
Idea 2
import threading

class LockedIterator(object):
    def __init__(self, it):
        self._lock = threading.Lock()
        self._it = it.__iter__()

    def __getattr__(self, item):
        attr = getattr(self._it, item)
        if callable(attr):
            # wrap the delegated method so it runs with the lock held,
            # and cache it so __getattr__ only fires once per method
            def hooked(*args, **kwargs):
                with self._lock:
                    return attr(*args, **kwargs)
            setattr(self, item, hooked)
            return hooked
        # non-callable attributes fall through (and so stay hidden)
This is more concise, but it can only intercept method calls, not, for example, direct attribute changes. (Those attributes are now hidden to prevent problems.) More importantly, it means that Python no longer recognizes my object as an iterator!
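To illustrate the second problem: explicit attribute access still works, because it goes through __getattr__, but as far as I can tell Python looks up special methods such as __iter__ and next on the class itself, so iteration never reaches the delegated methods. (A minimal sketch, again assuming Python 2 and using a plain list iterator just for demonstration.)

wrapped = LockedIterator(iter([1, 2, 3]))

# explicit attribute access goes through __getattr__, so this works:
print(wrapped.next())      # 1

# but the for loop's implicit lookup bypasses __getattr__ entirely:
try:
    for x in wrapped:
        print(x)
except TypeError as exc:
    print(exc)             # the object is not recognized as iterable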
What is the best way to make this work for all iterators (or, even better, all objects) without creating a leaky abstraction? I'm not too worried about locking when it isn't necessary, but if you can come up with a solution that avoids it, great!