
I think this is more a question of coding style than a technical issue.

Say I have a line of code:

buf = open('test.txt','r').readlines()
...

Will the file descriptor automatically close, or will it stay in memory? If the file descriptor is not closed, what is the preferred way to close it?

Patrick

5 Answers


If you assign the file object to a variable, you can explicitly close it using .close():

f = open('test.txt','r')
buf = f.readlines()
f.close()

Alternatively (and generally preferred), you can use the with keyword (Python 2.5 and greater) as mentioned in the Python docs:

It is good practice to use the with keyword when dealing with file objects. This has the advantage that the file is properly closed after its suite finishes, even if an exception is raised on the way. It is also much shorter than writing equivalent try-finally blocks:

>>> with open('test.txt','r') as f:
...     buf = f.readlines()
>>> f.closed
True
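For comparison, the try/finally form the docs allude to would look something like this (a sketch; the setup lines just create the sample file so the snippet runs on its own):

```python
# create a small sample file so the snippet is self-contained
setup = open('test.txt', 'w')
setup.write('hello\n')
setup.close()

# the try/finally equivalent of the with-statement above
f = open('test.txt', 'r')
try:
    buf = f.readlines()
finally:
    f.close()  # runs even if readlines() raises an exception

print(f.closed)  # → True
```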
sahhhm
  • "buf" in this case will be a list - no "close" method on it. – jsbueno Jan 05 '11 at 01:23
  • 3
    `with` is available in 2.5, but you need to import it using `from __future__ import with_statement`. This is not necessary from 2.6 onward. – Jay Conrod Jan 05 '11 at 01:27
  • @jsbueno thanks for catching my mistake! I updated the answer to reflect your observation! @Jay Conrod, thanks for the information! (I jumped from 2.4 to 2.7, so I never experienced that!) – sahhhm Jan 05 '11 at 01:29

Usually in CPython, the file is closed right away when the reference count drops to zero (although this behaviour is not guaranteed for future versions of CPython).

In other implementations, such as Jython, the file won't be closed until it is garbage collected, which can be a long time later.

It's poor style to have code that works differently depending on the implementation's behaviour.

If it's just for a quick script or something you are trying in the interpreter shell, it's good enough, but for any sort of production work you should usually use a context manager as in Falmarri's answer.
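To make the CPython refcounting behaviour visible, a weakref lets you watch the file object disappear the moment the last reference goes away (this is a CPython-specific sketch; under PyPy or Jython the object could linger until a GC pass):

```python
import weakref

# create a sample file so the snippet is self-contained
setup = open('test.txt', 'w')
setup.write('hello\n')
setup.close()

f = open('test.txt', 'r')
buf = f.readlines()
wr = weakref.ref(f)  # observe the file object without keeping it alive

del f        # the reference count drops to zero...
print(wr())  # → None in CPython: the object (and its descriptor) is already gone
```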

John La Rooy
  • An example of this biting the coder in different Python implementations was that Mercurial would exhaust the number of open files allowed by the OS when run on PyPy. Mercurial is undergoing changes to fix this. – TryPyPy Jan 05 '11 at 02:33

It will stay in memory until the garbage collector closes it. You should always explicitly close your file descriptors. Just do something like this:

with open('test.txt', 'r') as f:
    buf = f.readlines()
Falmarri
  • @Hugh Bothwell: Hmm. I guess it depends on your definition of explicitly. I would call it explicitly closing it. – Falmarri Jan 05 '11 at 05:17
  • It is more explicit in that we now know exactly when the file will get closed, but it's not really what I would call explicit in the programming sense. – John La Rooy Jan 05 '11 at 20:27
  • I would call that *automatically* closing the file handle. – Jasen Feb 27 '18 at 00:56

It will be closed automatically, but exactly when depends on the implementation. It's nicer to explicitly use a with-block, but if you are just writing a small script for yourself that you run occasionally, it doesn't really matter.

Lennart Regebro
  • Size of script and frequency of running are sublimely irrelevant. If the script iterates over many files opening them, it really does matter if it's not closing them. Sooner or later it will run out of handles and crash. – John Machin Jan 05 '11 at 07:54
  • You are right, I missed writing the important part: that it was for himself and run occasionally. Because then, when he runs out of file handles, he'll notice. ;) But if he put this at a customer site. :-/ – Lennart Regebro Jan 05 '11 at 07:58

You can also try using the os.close(fd) method.
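Note that os.close() expects a raw OS-level descriptor (an integer, as returned by os.open()), not the file object that the plain open() builtin gives you; a minimal sketch:

```python
import os

# create a sample file so the snippet is self-contained
with open('test.txt', 'w') as out:
    out.write('hello\n')

# os.close() pairs with os.open(), which returns an integer descriptor
fd = os.open('test.txt', os.O_RDONLY)
data = os.read(fd, 1024)
os.close(fd)

print(data)  # → b'hello\n' on Python 3
```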

jsa