I want to remove a file with os.remove() and then do some work on the remaining files in the directory. However, I find that os.listdir() still includes the erased file once it grows beyond a certain size. "OK", I thought, "os.remove() just works asynchronously. No big deal, I'll just use os.path.isfile() to check whether the file has been completely removed yet." This turned out not to work. The following code illustrates the problem:
import os

with open("test/test.txt", 'w') as file:
    for _ in range(100):
        file.write("spam")

print os.path.isfile("test/test.txt")
print os.listdir("test/")

os.remove("test/test.txt")

print os.path.isfile("test/test.txt")
print os.listdir("test/")
This creates a small file of 400 bytes. The output is as expected:
True
['test.txt']
False
[]
But when the number of "spam"s written is increased to 10 000 000 (a 40 MB file), the following output occurs:
True
['test.txt']
False
['test.txt']
So, isfile() is quite aware that the file has been erased, but listdir() hasn't caught on yet.
Is there a more robust way of checking whether a file exists, one that will always agree with a subsequent listdir() call?
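To be concrete about what I mean by "robust": a rough sketch of the kind of check I had in mind is below (robust_listdir is just a name I made up for this question; I don't know whether filtering like this is actually reliable on Windows):

import os

def robust_listdir(path):
    # Keep only the entries that os.path.exists() still confirms,
    # hoping to drop files that have been removed but that
    # os.listdir() still reports.
    return [name for name in os.listdir(path)
            if os.path.exists(os.path.join(path, name))]

print robust_listdir("test/")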
Tested with Python 2.7 on Windows 7, should it matter.
----EDIT
I have no intention of opening any files right away; I want to display all files remaining in the directory in a listbox. I feel that opening every file just to check whether it's there is uncalled for, but perhaps that is the Pythonic way of doing things?
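By "opening every file" I mean something along the lines of the sketch below (files_that_open is just an illustrative name), which tries to open each directory entry and treats a failure as "already deleted":

import os

def files_that_open(path):
    # Try to actually open each directory entry; if the open fails,
    # assume the file has already been removed and skip it.
    present = []
    for name in os.listdir(path):
        try:
            with open(os.path.join(path, name), 'rb'):
                present.append(name)
        except (IOError, OSError):
            pass
    return present

# These would be the names to show in the listbox.
print files_that_open("test/")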