I am using the code below to get JSON filenames in a directory:
import glob
jsonFiles = glob.glob(folderPath+"*.json")
Many new JSON files are created in the directory every second (roughly 100/s). Usually this works fine, but it gets stuck when the number of files is large (~150,000) and takes a long time (3-4 minutes) to retrieve the filenames. This might be because of the high incoming rate (I'm not sure).
Is there an alternative approach to get the filenames EFFICIENTLY, using Python or a Linux command? Getting the oldest 1000 filenames would work too; I don't need all the filenames at once.
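For context, here is a minimal sketch of the kind of thing I have in mind, using os.scandir and heapq.nsmallest to pick out the oldest files by modification time without sorting the full listing (the function name and the 1000 limit are just placeholders; I have not tested this at the ~150,000-file scale):

import heapq
import os

def oldest_json_files(folder_path, limit=1000):
    # Iterate directory entries lazily instead of building one big list.
    with os.scandir(folder_path) as entries:
        json_entries = (e for e in entries
                        if e.is_file() and e.name.endswith(".json"))
        # Keep only the `limit` oldest entries in memory, keyed by mtime.
        oldest = heapq.nsmallest(limit, json_entries,
                                 key=lambda e: e.stat().st_mtime)
    return [e.path for e in oldest]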
I came across the following shell command:
ls -Art | head -n 1000
Will it help? Does it list all the filenames first and then retrieve the 1000 oldest? Thanks in advance.
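If it is useful, this is roughly how I would call that pipeline from Python (running it via subprocess with shell=True and cwd=folderPath is my assumption; I have not benchmarked it against the glob version, and it does not filter for *.json):

import subprocess

# ls -t sorts by mtime (newest first), -r reverses so the oldest come first,
# -A includes everything except . and ..; head keeps the first 1000 lines.
result = subprocess.run(
    "ls -Art | head -n 1000",
    shell=True,
    cwd=folderPath,
    capture_output=True,
    text=True,
    check=True,
)
oldest1000 = result.stdout.splitlines()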