For instance:
import time

def read_file(f):
    with open(f, 'r') as file_to_read:
        while True:
            line = file_to_read.readline()
            if line:
                yield line
            else:
                # no new line yet, wait a bit before polling again
                time.sleep(0.1)
The generator is consumed by another function:
def fun_function(f):
    lines = read_file(f)
    for line in lines:
        do_fun_stuff()
A use case would be reading a continuously updated text file, such as a log file where new lines are appended every second or so.
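To make the intended use concrete, here is a minimal sketch of how I would exercise it: a hypothetical append_lines() helper simulates the log producer by appending to a temporary file in a background thread, while read_file() from above tails it.

import os
import tempfile
import threading
import time

def append_lines(path, count=5, delay=1.0):
    # hypothetical writer that simulates a slowly growing log
    with open(path, 'a') as log:
        for i in range(count):
            log.write(f"entry {i}\n")
            log.flush()
            time.sleep(delay)

fd, log_path = tempfile.mkstemp()
os.close(fd)
threading.Thread(target=append_lines, args=(log_path,), daemon=True).start()

consumed = 0
for line in read_file(log_path):
    print(line, end='')   # stand-in for do_fun_stuff()
    consumed += 1
    if consumed == 5:     # stop once the simulated writer is done
        break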
As far as I understand, read_file() blocks its consumer until a new line can be yielded. But since nothing should happen unless a new line is present in the file, that seems fine here. My question is whether there are other reasons (performance, for example) not to prefer this blocking pattern.
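For comparison, the non-blocking variant I could imagine would look roughly like the sketch below (read_file_nonblocking is a name I made up; yielding None when no line is available is my assumption of how one might hand control back to the caller).

def read_file_nonblocking(f):
    # yields None instead of sleeping, so the caller decides whether
    # to wait, do other work, or give up
    with open(f, 'r') as file_to_read:
        while True:
            line = file_to_read.readline()
            yield line if line else None

def fun_function_nonblocking(f):
    for line in read_file_nonblocking(f):
        if line is None:
            time.sleep(0.1)   # or do something else in the meantime
        else:
            do_fun_stuff()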