I would like to know the most efficient way to test whether a large file exists locally (without loading it into memory). If it doesn't exist (or isn't readable), download it. The goal is to load the data into a pandas DataFrame.
I wrote the snippet below, which works (tested with a small file). How does it fare in terms of correctness and Pythonic style?
import os
import pandas as pd

url = "http://www-bcf.usc.edu/~gareth/ISL/Advertising.csv"  # 4.7 kB
file = "./test_file.csv"
try:
    # Probe for existence/readability; os.open raises OSError if either fails.
    fd = os.open(file, os.O_RDONLY)
    os.close(fd)  # close the descriptor so it doesn't leak
    df_data = pd.read_csv(file, index_col=0)
except OSError:
    df_data = pd.read_csv(url, index_col=0)
    df_data.to_csv(file)  # cache locally for next run