Working with Python 3, I had a requirement:
- Perform some pre-work
- Do the core work
- Cleanup the pre-work
Taking inspiration from fixtures in pytest, I came across this post and wrote some crazy code.
Though this crazy code works, I wish to understand the yield sorcery that makes it work :)
def db_connect_n_clean():
    db_connectors = []

    def _inner(db_obj):
        db_connectors.append(db_obj)
        print("Connect : ", db_obj)

    yield _inner
    for conn in db_connectors:
        print("Dispose : ", conn)
This is the driver code:
pre_worker = db_connect_n_clean()
freaky_function = next(pre_worker)
freaky_function("1")
freaky_function("2")
try:
    next(pre_worker)
except:
    pass
It produces this output:
Connect : 1
Connect : 2
Dispose : 1
Dispose : 2
Traceback (most recent call last):
File "junk.py", line 81, in <module>
next(pre_worker)
StopIteration
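To isolate what I am seeing, here is a stripped-down sketch of the two-phase execution as I understand it (the names are my own, not from the code above): the first next() runs the generator body up to the yield and returns the yielded value, and the second next() resumes after the yield, runs the remaining code, and then raises StopIteration.

```python
def two_phase():
    print("setup")        # runs on the FIRST next()
    yield "handle"        # value returned by the first next()
    print("teardown")     # runs on the SECOND next(), just before StopIteration

g = two_phase()
value = next(g)           # prints "setup"; value == "handle"
try:
    next(g)               # prints "teardown", then raises StopIteration
except StopIteration:
    print("generator exhausted")
```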
What confuses me in this code is that all the calls to the same generator, freaky_function, maintain a single list of db_connectors. After the first yield, all the objects are disposed and I hit StopIteration. I was thinking that calling freaky_function twice would maintain two separate lists and there would be two separate yields.
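To test my assumption, I wrote this sketch (variable names are mine): freaky_function is not the generator itself but the _inner closure yielded by one generator instance, so every call goes through the same closure and the same db_connectors list. Only calling db_connect_n_clean() again would create a second generator with its own list.

```python
def db_connect_n_clean():
    db_connectors = []

    def _inner(db_obj):
        db_connectors.append(db_obj)
        print("Connect : ", db_obj)

    yield _inner
    for conn in db_connectors:
        print("Dispose : ", conn)

gen_a = db_connect_n_clean()   # first generator instance, first list
gen_b = db_connect_n_clean()   # second generator instance, second list
inner_a = next(gen_a)          # runs gen_a up to its yield
inner_b = next(gen_b)          # runs gen_b up to its own yield
inner_a("1")
inner_a("2")                   # same closure -> appends to gen_a's list
inner_b("3")                   # separate generator -> separate list
```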
Update: The goal of this question is not to understand how to achieve this. As is evident from the comments, a context manager is the way to go. But my question is to understand how this piece of code works, basically the Python side of it.
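For completeness, this is roughly what the context-manager version suggested in the comments would look like, using contextlib.contextmanager (a sketch, not the exact code anyone posted):

```python
from contextlib import contextmanager

@contextmanager
def db_connect_n_clean():
    db_connectors = []

    def _inner(db_obj):
        db_connectors.append(db_obj)
        print("Connect : ", db_obj)

    try:
        yield _inner           # the with-block runs while suspended here
    finally:
        for conn in db_connectors:
            print("Dispose : ", conn)

with db_connect_n_clean() as connect:
    connect("1")
    connect("2")
# leaving the with-block resumes the generator, running the cleanup loop
```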