Am I doing something wrong, or is finding the most recent file at a given path supposed to be fairly slow?
The code below takes upwards of 3 minutes. Is that expected when iterating over a list of ~850 files?

I am using a glob pattern to match only .txt files, so after searching my file share location it returns a list of ~850 files. That is the list it iterates over to find max(list_of_files, key=os.path.getctime). I also tried sorting instead of max and just grabbing the top file, but that wasn't any faster.
import os
import glob

def get_latest_file(path, fileRegex):
    fullpath = os.path.join(path, fileRegex)
    # glob.iglob returns a generator, which is always truthy,
    # so materialize it into a list before the empty check
    list_of_files = list(glob.iglob(fullpath, recursive=True))
    if not list_of_files:
        return ''
    return max(list_of_files, key=os.path.getctime)

path = r'C:\Desktop\Test'  # raw string so the backslashes aren't treated as escapes
fileRegex = '*.txt'
latestFile = get_latest_file(path, fileRegex)
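For context, one likely cause of the slowness is that os.path.getctime makes a separate stat call per file, and on a network share each call is a round trip. A hedged alternative sketch (get_latest_file_scandir is a name I made up for illustration) uses os.scandir, whose DirEntry.stat() results are, on Windows, obtained from the directory listing itself and cached, so no extra per-file round trip is needed:

```python
import os

def get_latest_file_scandir(path):
    # os.scandir yields DirEntry objects; on Windows their stat()
    # data comes from the directory scan itself and is cached,
    # avoiding one network round trip per file
    latest, latest_ctime = '', float('-inf')
    with os.scandir(path) as entries:
        for entry in entries:
            if entry.is_file() and entry.name.lower().endswith('.txt'):
                ctime = entry.stat().st_ctime
                if ctime > latest_ctime:
                    latest, latest_ctime = entry.path, ctime
    return latest
```

This only scans a single directory (no recursion); whether it actually helps on your share is an assumption worth testing, but it is the usual first thing to try when per-file stat calls dominate.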