It seems the main conclusion is not to use ls. Back in the Pleistocene age of Unix programming, ls was the tool for this job; these days, however, ls is best restricted to producing human-readable displays. A script that is robust against anything that can be thrown at it (newlines, white space, Chinese characters mixed with Hebrew and French, or whatever) is best written with some form of globbing, as recommended by others here and by BashPitfalls.
#!/bin/bash
for file in ./*; do
    [ -e "${file}" ] || continue   # skip the literal ./* when nothing matches
    # Do some task; for example, test whether it is a directory.
    if [ -d "${file}" ]; then
        echo "${file}"
    fi
done
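The [ -e ] guard is needed because, when the glob matches nothing, the shell hands the loop the literal string ./*. If you are certain the shell is bash (this is a bash extension, not POSIX sh), an alternative is the nullglob option, which makes a non-matching glob expand to nothing instead:

$ shopt -s nullglob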
The ./ is perhaps not absolutely necessary, but it helps if a file name begins with a "-", it makes clear which file name contains the newline (or newlines), and it guards against some other nasty buggers. It is also a useful template for matching specific files (e.g., ./*.pdf; see the sketch below).
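Here is such a sketch for the ./*.pdf case; the task shown (wc -c, printing each file's size in bytes) is just a stand-in for whatever you actually want to do per file:

#!/bin/bash
for file in ./*.pdf; do
    [ -e "${file}" ] || continue   # skip the literal ./*.pdf when no PDFs exist
    wc -c -- "${file}"             # placeholder task: print each PDF's size in bytes
done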
t". Then (revealing other issues with ls
when using nonstandard characters)
$ ls
-t ?t
$ for file in *; do ls "${file}"; done
-t ?t
?t
whereas:
$ for file in ./*; do ls "${file}"; done
./-t
./?t
and, using echo to show that the loop variable itself holds the real name:
$ for file in ./*; do echo "${file}"; done
./-t
./
t
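If you would rather see the newline made visible than printed literally, bash's printf with the %q format (a bash extension) prints a shell-quoted form of each name:

$ for file in ./*; do printf '%q\n' "${file}"; done
./-t
$'./\nt'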
A workaround using only POSIX commands is to terminate option parsing with --:
$ for file in *; do ls -- "${file}"; done # workaround
-t
?t
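For reference, the same -- idiom lets you create and remove these troublesome test files in the first place ($'\n' is bash quoting for a literal newline):

$ touch -- -t $'\nt'   # create both files
$ rm -- -t $'\nt'      # and clean them up again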