
Am I doing something wrong, or is finding the most recent file at a given path supposed to be fairly slow?

The code below takes upwards of 3 minutes. Is that expected for iterating through a list of ~850 files?

I am using a glob pattern to match only .txt files, so after searching through my file share location it returns a list of ~850 files. This is the list it iterates through to find the max(file) with key=os.path.getctime. I tried sorting instead of max() and just grabbing the top file, but that wasn't any faster.

import os
import glob

def get_latest_file(path, fileRegex):
    # Build the full glob pattern, e.g. C:\Desktop\Test\*.txt
    fullpath = os.path.join(path, fileRegex)
    list_of_files = glob.iglob(fullpath, recursive=True)

    # iglob returns a generator, so an empty result can't be tested
    # with "if not"; max() with a default handles that case instead
    latestFile = max(list_of_files, key=os.path.getctime, default='')
    return latestFile

path = r'C:\Desktop\Test'   # raw string, so the backslashes aren't escapes
fileRegex = '*.txt'
latestFile = get_latest_file(path, fileRegex)
– CorgiGeek
  • Does this answer your question? [How to get the latest file in a folder?](https://stackoverflow.com/questions/39327032/how-to-get-the-latest-file-in-a-folder) – PacketLoss May 06 '21 at 15:59
  • @PacketLoss - Unfortunately no. That just points me to the code I already have, but doesn't explain why it would be so slow (~3 minutes to find the latest file). – CorgiGeek May 06 '21 at 16:23

1 Answer


Try using os.scandir(); it sped up my file searching massively.
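A minimal sketch of what that might look like (not the answerer's actual code; get_latest_file_scandir and the .txt suffix filter are carried over from the question as assumptions). The likely reason this is faster: os.DirEntry objects returned by os.scandir() carry stat information gathered during the directory listing itself, so on Windows no extra system call is needed per file, whereas os.path.getctime() issues a separate stat for each of the ~850 files.

import os

def get_latest_file_scandir(path, suffix='.txt'):
    # DirEntry.stat() uses data cached from the directory scan
    # (on Windows, no additional system call per regular file),
    # avoiding one getctime() round trip per file.
    latest, latest_ctime = '', float('-inf')
    with os.scandir(path) as entries:
        for entry in entries:
            if entry.is_file() and entry.name.endswith(suffix):
                ctime = entry.stat().st_ctime
                if ctime > latest_ctime:
                    latest, latest_ctime = entry.path, ctime
    return latest

latestFile = get_latest_file_scandir(r'C:\Desktop\Test')

On a network share those per-file stat calls are each a round trip to the file server, which is plausibly where the 3 minutes went.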

– Martin