
How can I find a list of open file objects at a snapshot in time?

Consider a few cases where a file can be opened but never closed (neither via a with block nor an explicit f.close()):

import io
import typing

buf = io.StringIO()
buf.write('memory here')  # still open

f = open('/tmp/file.tmp', 'w')
f.write('more memory')  # still open

def g() -> None:
    open('/tmp/file2.tmp', 'w')  # unclosed file
g()


def h() -> typing.TextIO:
    return open('/tmp/file3.tmp', 'w')
res = h()  # open (res.closed is False)

I am aware that the GC will eventually destroy the objects and close the open files on its own. However, I am interested in manually finding which open files may be using system resources at any given time.
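For concreteness, the kind of snapshot I have in mind would look roughly like the sketch below. It walks gc.get_objects() and keeps every io.IOBase instance that is not yet closed; open_file_objects is just a helper name I made up, and I am not sure this catches every case (e.g. objects not tracked by the collector), so treat it as an illustration of the goal rather than a solution:

import gc
import io

def open_file_objects() -> list:
    # Hypothetical helper: snapshot of every GC-tracked io.IOBase
    # instance that has not been closed yet.
    return [obj for obj in gc.get_objects()
            if isinstance(obj, io.IOBase) and not obj.closed]

for f in open_file_objects():
    print(f)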

One way, it seems, is to look in __warningregistry__; is this the de facto way to do such discovery, or is there a better route?

>>> g()
>>> __warningregistry__
{'version': 0, ("unclosed file <_io.TextIOWrapper name='/tmp/file2.tmp' mode='w' encoding='UTF-8'>",
 <class 'ResourceWarning'>, 2): True}
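For what it's worth, the same unclosed-file warning can also be captured explicitly, which is how I confirmed what that registry entry refers to. This is only a sketch: it assumes CPython's finalization behavior, and the gc.collect() call is there in case the implementation does not use reference counting:

import gc
import warnings

def g() -> None:
    open('/tmp/file2.tmp', 'w')  # unclosed file, as above

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always', ResourceWarning)
    g()           # the file object becomes unreachable here
    gc.collect()  # force finalization if it has not happened already

for w in caught:
    print(w.category.__name__, '-', w.message)

This prints the same "unclosed file <_io.TextIOWrapper ...>" message that shows up in __warningregistry__, but it only reports a file after it has already been finalized, which is why I am asking about a snapshot of files that are still open.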
