
I have the following line:

service = discovery.build('sheets', 'v4', credentials=pickle.load(open('google_sheets_token.pickle', 'rb')))

And while it's trivial to implement something like this:

with open('google_sheets_token.pickle', 'rb') as f:
    service = discovery.build('sheets', 'v4', credentials=pickle.load(f)) 

I was wondering: is it possible to have a file closed automatically after it's been passed to a function?

David542
  • No, not if you don't keep a reference to it. The idiomatic way is to use a context manager, that's what they are for – juanpa.arrivillaga Jan 22 '20 at 22:10
  • Are you only concerned with files opened for reading, or also writing? – Barmar Jan 22 '20 at 22:20
  • Does this answer your question? [opening & closing file without file object in python](https://stackoverflow.com/questions/22070063/opening-closing-file-without-file-object-in-python) – Georgy Jan 22 '20 at 22:31
  • @Georgy not really. That basically just says to use the approach I've used in the second part of the question: I'm curious if it's possible to not use that at all. – David542 Jan 22 '20 at 22:36

1 Answer


Nope. In the CPython reference interpreter, as long as no reference cycles are involved (they're easier to create than you might think, so don't count on their absence), the first example will close the file automatically: the reference count drops to 0 immediately after the function returns (assuming it didn't save off a reference). But that's not something to rely on. Use a with statement if you want guarantees on CPython, and even a hope of correct behavior on non-reference-counted interpreters like PyPy, Jython, IronPython, etc. Without with management, the file will probably close eventually, but there's no guarantee of when it will happen (or even whether; it's best effort, not an ironclad guarantee).
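If the goal is just to keep the call site down to one line, one option (a sketch, not part of the question's code; the helper name is made up) is to move the with block into a small function:

```python
import pickle

def load_credentials(path):
    """Load a pickled object, closing the file deterministically."""
    with open(path, 'rb') as f:
        return pickle.load(f)

# The call site stays a one-liner, and the file is guaranteed closed
# on every interpreter, not just CPython:
# service = discovery.build('sheets', 'v4',
#     credentials=load_credentials('google_sheets_token.pickle'))
```

This keeps the deterministic-close guarantee of with while hiding it behind the function boundary.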

ShadowRanger
  • Even without reference counting, the garbage collector will eventually reclaim it. – Barmar Jan 22 '20 at 22:13
  • @Barmar: The cyclic garbage collector won't reclaim it *deterministically* (it could take milliseconds or hours, and cyclic GC can be turned *off*, making it never happen), and even if it works, it's not 100% guaranteed to close in the right order when it comes to arbitrary objects (the `io` module backing `open` has most of the bugs ironed out, but for a long time, it was possible for the text encoding I/O layer or the buffering I/O to close *after* the raw underlying file I/O layer closed, dropping unflushed data in the buffers; the cyclic GC just didn't know which one to close first). – ShadowRanger Jan 22 '20 at 22:16
  • He didn't specify, but the example in the question is a file opened for reading, so buffer flushing shouldn't be an issue. I definitely would use a context manager for a write file. – Barmar Jan 22 '20 at 22:20
  • @ShadowRanger thanks this is a very helpful answer, much better than the one linked as a duplicate (is every question on SO now just closed as a duplicate based on the title?) – David542 Jan 22 '20 at 22:35
  • @Barmar: Even for a file opened for read, I've run on systems with absurdly low `ulimit`s for open file handles; if you're opening and closing files a lot, and either you're not on CPython, or some of the code paths on CPython are incorporating the file object into a reference cycle (catching exceptions being a common cause, since the exception contains a reference to a traceback that references their frame, which references the exception itself), it's possible to reach the limit before they're collected, and now you can't open new files until a collection cleans out the old ones. – ShadowRanger Jan 23 '20 at 20:37
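The reference-counting behavior the answer describes can be observed directly with a weakref (a CPython-specific sketch; the temp file is just a throwaway for the demo):

```python
import weakref
import tempfile
import os

path = tempfile.mkstemp()[1]  # throwaway file for the demonstration
f = open(path, 'rb')
ref = weakref.ref(f)          # track the file object without keeping it alive

del f  # last strong reference gone: CPython closes and collects the file here

# On CPython the weakref is already dead at this point; on PyPy or other
# non-reference-counted interpreters it may still be alive until a GC runs.
print(ref() is None)
os.remove(path)
```

On CPython this prints True immediately after the del, which is exactly the prompt cleanup the answer says you get for free when no reference cycle is involved.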