
I am trying to write a bash script to list the size of each file/subdir of the current directory, as follows:

for f in $(ls -A)
do
    du -sh $f
done

I used ls -A because I need to include hidden files/dirs starting with a dot, like .ssh. However, the script above cannot handle file names that contain spaces. E.g. I have a file called:

books to borrow.doc

and the above script will return:

du: cannot access `books': No such file or directory
du: cannot access `to': No such file or directory
du: cannot access `borrow.doc': No such file or directory

There is a similar question, Shell script issue with filenames containing spaces, but there the list of names to process comes from expanding * (instead of ls -A). The answer to that question was to add double quotes around $f. I tried the same, i.e., changing

    du -sh $f

to

    du -sh "$f"

but the result is the same. My question is: how do I write the script so that it handles spaces here?

Thanks.

thor

5 Answers


Don't parse the output of ls. When a file name contains a space, $f holds the pieces of the name split at the spaces, so the double quotes never see the whole filename.

The following will work and does the same as your script:

GLOBIGNORE=".:.."  #ignore . and ..
shopt -s dotglob   # make * match hidden files (names starting with .) too
for f in *
do
    #echo "==$f=="
    du -sh "$f"  #double quoted (!!!)
done
clt60
  • Shouldn't need to directly set `GLOBIGNORE` to `.:..`, since `The file names . and .. are always ignored when GLOBIGNORE is set and not null.` – Reinstate Monica Please Sep 14 '14 at 22:02
  • Yes, but it doesn't hurt, and it ensures that it is really SET to something. – clt60 Sep 14 '14 at 22:04
  • True. And I was actually a bit confused about how this worked when writing my previous comment. `.` and `..` still won't match a naked `*`, even if `GLOBIGNORE` is nulled out and dotglob is set (presumably since `.` and `..` aren't really filenames). They will match `.*` for some reason, unless you set `GLOBIGNORE` to a non-null value. – Reinstate Monica Please Sep 14 '14 at 22:15
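
To see the globbing rules discussed in these comments in action, here is a small throwaway sketch (the directory and file names are purely illustrative; the behavior shown is that of the bash versions discussed here, and note that bash 5.2+ skips . and .. in globs by default):

    mkdir demo && cd demo                 # scratch directory, name is arbitrary
    touch .hidden "books to borrow.doc"

    shopt -s dotglob
    echo *          # .hidden and "books to borrow.doc", but never . or ..
    echo .*         # . .. .hidden -- the dot directories do match .*

    GLOBIGNORE=".:.."
    echo .*         # .hidden only -- . and .. are ignored once GLOBIGNORE is non-null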

Unless the directory has so many entries that the expanded list of file names becomes too long, you can simply use:

du -sh * .*

Be aware that this will include . and .., though. If you want to eliminate .. (probably a good idea), you can use:

for file in * .*
do
    [ "$file" = ".." ] && continue
    du -sh "$file"  # Double quotes important
done

You can consider assigning the names to an array and then working on the array:

files=( * .* )
for file in "${files[@]}"
do
    ...
done

You might use variations on that to run du on groups of names, but you could also consider using:

printf "%s\0" "${files[@]}" | xargs -0 du -sh
Jonathan Leffler
  • `( shopt -s dotglob; du -sh * )` would eliminate `.` and `..` as well. Still a bit confused as to why `.*` matches `.` and `..` – Reinstate Monica Please Sep 14 '14 at 22:20
  • I tried the second option. `files=( * .*) ...`, all that I got was `1.7G . 1.7G . 1.7G . ... ` Am I missing something? – thor Sep 15 '14 at 18:28
  • It depends on what you put in place of the three dots. If it was `du -sh "$file"`, you should be OK. If you put something else, there are certainly ways you can get the same size over and over again (`du -sh "$files"`, with the plural name, would be one such). – Jonathan Leffler Sep 15 '14 at 18:31
  • I was using `du -sh $file`. Now that I realize that I forgot to add quotes, I have tried exactly your option 1, I got a bunch of `du: invalid zero-length file name`. I am using MinGW32 from mingw.org. What might the issue be? – thor Sep 15 '14 at 18:52

I generally prefer using the program find if a for loop would cause headaches. In your case, it is really simple:

$ find . -maxdepth 1 -exec du -sh '{}' \;

There are a number of security issues with using -exec which is why GNU find supports the safer -execdir that should be preferred if available. Since we are not recursing into directories here, it doesn't make a real difference, though.
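
If your version of find supports it, the -execdir spelling of the same command would look like this (a sketch only; GNU find, which will also refuse to run -execdir if the current directory is listed in your PATH):

    find . -maxdepth 1 -execdir du -sh '{}' \;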

The GNU version of find also has an option (-print0) to print out matched file names separated by NUL bytes but I find the above solution much simpler (and more efficient) than first outputting a list of all file names, then splitting it at NUL bytes and then iterating over it.

5gon12eder

Try this:

    ls -A |
    while read -r line
    do
        du -sh "$line"
    done

Instead of splitting the ls -A output word by word, the while loop reads it line by line. This way, you don't need to change the IFS variable.
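
Note that read with the default IFS still trims leading and trailing blanks from each line. If such file names are possible, a commonly used variant (a sketch, not part of the original answer) clears IFS just for the read:

    ls -A |
    while IFS= read -r line
    do
        du -sh "$line"
    done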


Time to summarize. Assuming you are using Linux, this should work in most (if not all) cases.

find -maxdepth 1 -mindepth 1 -print0 | xargs -r -0 du -sh
Vytenis Bivainis