Parsing the output of `ls` to iterate through a list of files is bad. So how should I go about iterating through a list of files in the order in which they were first created? I browsed several questions here on SO and they all seem to be parsing `ls`.
The embedded link suggests:
> Things get more difficult if you wanted some specific sorting that only `ls` can do, such as ordering by mtime. If you want the oldest or newest file in a directory, don't use `ls -t | head -1` -- read Bash FAQ 99 instead. If you truly need a list of all the files in a directory in order by mtime so that you can process them in sequence, switch to perl, and have your perl program do its own directory opening and sorting. Then do the processing in the perl program, or -- worst case scenario -- have the perl program spit out the filenames with NUL delimiters.
>
> Even better, put the modification time in the filename, in YYYYMMDD format, so that glob order is also mtime order. Then you don't need ls or perl or anything. (The vast majority of cases where people want the oldest or newest file in a directory can be solved just by doing this.)
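For the "oldest or newest file" case the quote mentions, the Bash FAQ 99 approach can be sketched in pure bash without parsing `ls`, by comparing modification times with the `[[ -nt ]]` test. This is only an illustrative sketch: the demo directory and filenames are made up, and `touch -d` as used here is a GNU extension.

```shell
#!/usr/bin/env bash
# Pure-bash sketch of the Bash FAQ 99 idea: find the newest regular file
# in a directory by comparing mtimes with [[ -nt ]], no ls involved.
# Demo files and timestamps below are hypothetical.
dir=$(mktemp -d)
touch -d '2019-01-01' "$dir/old"    # GNU touch: set an explicit mtime
touch -d '2021-01-01' "$dir/new"

newest=
for f in "$dir"/*; do
    [[ -f $f ]] || continue                     # skip non-regular files
    if [[ -z $newest || $f -nt $newest ]]; then # -nt: newer mtime than
        newest=$f
    fi
done
printf 'newest: %s\n' "$newest"
rm -rf "$dir"
```

The same loop with `-ot` instead of `-nt` yields the oldest file.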
Does that mean there is no native way of doing it in `bash`? I don't have the liberty to modify the filenames to include the time in them. I need to schedule a script in `cron` that runs every 5 minutes, generates an array containing all the files in a particular directory ordered by their creation time, performs some actions on the filenames, and moves them to another location.
The following worked, but only because I don't have funny filenames. The files are created by a server, so they will never contain special characters, spaces, newlines, etc.
files=( $(ls -1tr) )
I can write a `perl` script that would do what I need, but I would appreciate it if someone could suggest the right way to do it in `bash`. A portable option would be great, but a solution using the latest GNU utilities would not be a problem either.
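For context on the GNU-utilities route: with GNU `find`, `sort`, and `cut`, one can build the array with NUL delimiters end to end, so arbitrary filenames are safe. The sketch below is an assumption about how this would typically be done, not a vetted answer; the demo directory and files are made up, and `find -printf`, `sort -z`, `cut -z`, and `touch -d` are all GNU extensions.

```shell
#!/usr/bin/env bash
# Sketch (GNU find/sort/cut): collect regular files into an array,
# oldest mtime first, NUL-delimited throughout so spaces, newlines,
# and other special characters in names cannot break the list.
set -euo pipefail

dir=$(mktemp -d)                    # hypothetical demo directory
touch -d '2020-01-01' "$dir/middle file"
touch -d '2019-01-01' "$dir/oldest"
touch -d '2021-01-01' "$dir/newest"

files=()
while IFS= read -r -d '' name; do
    files+=( "$name" )
done < <(find "$dir" -maxdepth 1 -type f -printf '%T@ %p\0' \
           | sort -z -n \
           | cut -z -d' ' -f2-)     # drop the leading "mtime " field

printf '%s\n' "${files[@]}"
rm -rf "$dir"
```

Swapping `sort -z -n` for `sort -z -rn` would give newest-first order instead.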