You would think that looping over files would be easy, right? But in bash it is full of pitfalls.
Using globs carelessly is the WORST. Trust me, don't do it like this:
for x in *; do # <--- bad: unquoted $x gets word-split, * skips dotfiles, and with no matches you loop over a literal '*'
echo the file name is $x
done
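To be fair, the glob itself is not the bug; the unquoted expansion is. Here is a sketch of a glob loop that survives spaces, assuming bash with nullglob enabled (the scratch directory and filenames are just for the demo):

dir=$(mktemp -d)                      # hypothetical scratch dir for the demo
touch "$dir/plain.txt" "$dir/has space.txt"
shopt -s nullglob dotglob             # nullglob: no match -> zero iterations; dotglob: include hidden files
for x in "$dir"/*; do
    [ -f "$x" ] || continue           # keep only regular files, like find -type f would
    echo "the file name is ${x##*/}"  # quotes keep the space intact
done

Note the quotes around "$x" everywhere: that is what stops word splitting.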
Using find is better, for instance:
for x in $(find . -maxdepth 1 -type f); do # <-- still assumes no filename has spaces
echo the file name is $x
done
find has a lot of options to filter results by name, by date, by owner... whatever. It is very powerful.
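A few of those filters, for illustration (the patterns and user name here are made up):

find . -maxdepth 1 -type f -name '*.log'   # by name pattern
find . -type f -mtime -7                    # modified in the last 7 days
find . -type f -user alice                  # owned by user alice
find . -type f -size +10M                   # larger than 10 MB

These predicates combine with an implicit AND, so you can stack as many as you need.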
However, a for-over-find FAILS if a filename contains spaces, because the command substitution gets word-split. To fix that, use:
while IFS= read -r x; do # IFS= keeps leading/trailing whitespace, -r keeps backslashes
echo the file name is "$x"
done < <(find . -maxdepth 1 -type f)
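Why the `< <(...)` process substitution instead of the more obvious `find ... | while read` pipeline? Because in bash each pipeline stage runs in a subshell, so any variable you set inside the loop vanishes when the loop ends. A minimal sketch of the difference:

count=0
printf 'a\nb\n' | while IFS= read -r x; do
    count=$((count + 1))    # increments a copy inside the pipeline's subshell
done
echo "$count"               # prints 0: the subshell's count is gone

count=0
while IFS= read -r x; do
    count=$((count + 1))    # runs in the current shell
done < <(printf 'a\nb\n')
echo "$count"               # prints 2

So if the loop only echoes things, the pipeline is fine; if it accumulates state, you want the redirection form.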
Or if you don't like that weird done < <(...) process-substitution syntax, you can use:
result=$(find . -maxdepth 1 -type f)
while IFS= read -r x; do
echo the file name is "$x"
done <<< "$result"
However, what if the filename contains a linefeed?! Can that happen? Yes it can — any byte except '/' and NUL is legal in a filename — but it is extremely rare. So if you are PARANOID you can do:
while IFS= read -r -d '' x; do # -d '' splits on the NUL bytes that -print0 emits
echo the file name is "$x"
done < <(find . -maxdepth 1 -type f -print0)
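If you want to see the difference for yourself, create a filename with an embedded newline in a scratch directory (the directory and name are hypothetical) and compare the two readers:

dir=$(mktemp -d)
touch "$dir/bad"$'\n'"name"   # yes, this is a legal filename

# newline-delimited read sees two bogus half-entries...
while IFS= read -r x; do
    echo "line: $x"
done < <(find "$dir" -type f)

# ...while NUL-delimited read sees exactly one file
while IFS= read -r -d '' x; do
    echo "got one file"
done < <(find "$dir" -type f -print0)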
In my opinion the extra mess is not worth it, so I don't recommend it. People who put linefeeds in filenames deserve to feel pain.