
I want to print out all sub-directory names with their content file names below the directory name. If a sub-directory is empty then don't print the directory name and go to the next sub-directory. The applicable part of my code:

for dirs in "$mydir"/*
do
   if [ -d "$dirs" -type f" ] && [ "find "$dirs" -type f" ]
   then
      echo "Processing directory $dirs"
         for subfiles in $dirs/*
         do
            echo "Encoding $subfiles"
         done
   fi
done

If I leave off the second condition of the first if statement then empty directories will print their name to the screen and a * will be listed below that (I guess representing the fact that there's nothing in the directory). The portion after the && doesn't cause any errors, but it isn't preventing empty directories from reaching the rest of this section of the code.
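For reference, here's roughly what that looks like with an empty directory under bash's default glob settings (the unmatched glob is left as literal text, so the loop still runs once with the pattern itself; `emptydir` is just an example name):

$ mkdir emptydir
$ for f in emptydir/*; do echo "Encoding $f"; done
Encoding emptydir/*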

How can I get this to work?

gregm
  • See: [Bash checking if folder has contents](https://stackoverflow.com/q/20456666/3776858) – Cyrus May 12 '19 at 14:28
  • @Cyrus right, I know how to do that on its own. I want to both check if an entry in a directory is in fact a dir AND if it has one or more files in it, in one single command. Let's say that my Home directory contains /Pictures, which has no files in it, and a file called file1.txt. If I don't check whether file1.txt is a dir then the script echoes "Processing directory file1.txt", so I first have to make sure what is a dir and what isn't. THEN, if there is a directory, I need to make sure that it contains files. Otherwise, I get notified that there is a directory, but the screen echoes * for empty. – gregm May 12 '19 at 15:27
  • `if [[ -d "$dirs" ]] && files=( "$dirs"/* ) && [[ -e ${files[0]} || -L ${files[0]} ]]; then` -- it's not simple, but if you want something that's reliable and performs well, that's what you get. – Charles Duffy May 12 '19 at 20:13
  • @CharlesDuffy that would fail if the first thing alphabetically under `"$dirs"` was a directory. – Ed Morton May 12 '19 at 20:46
  • The word they use for directories they want to exclude is very explicitly "empty". A directory that contains subdirectories isn't empty. – Charles Duffy May 12 '19 at 20:48
  • Right. I'm saying if under dir `x` you have a sub-directory named `a` and a file named `b` then that test will exclude the non-empty directory `x` since `a` will be stored in `files[0]` and so will fail the `[[ -e ${files[0]} || -L ${files[0]} ]]` test. – Ed Morton May 12 '19 at 20:50

4 Answers


Change your condition to test for empty directories:

for dirs in "$mydir"/*
do
   if [ -d "$dirs" ] && [ -n "$(ls -A $dirs)" ]
   then
      echo "Processing directory $dirs"
         for subfiles in $dirs/*
         do
            echo "Encoding $subfiles"
         done
   fi
done
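
If the directory or file names can contain spaces, the same idea with the expansions quoted (a sketch; the `ls -A` test itself is unchanged):

for dirs in "$mydir"/*
do
   if [ -d "$dirs" ] && [ -n "$(ls -A "$dirs")" ]
   then
      echo "Processing directory $dirs"
      for subfiles in "$dirs"/*
      do
         echo "Encoding $subfiles"
      done
   fi
done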
scrambler
  • You mean `for subfiles in "$dirs"/*`? And `ls -A "$dirs"` (though that's not generally advisable; `ls` isn't specified well enough to make relying on its output or lack thereof portable across platforms). – Charles Duffy May 12 '19 at 20:11
  • ...as this is, though, it'll misbehave when your directory names have spaces. – Charles Duffy May 12 '19 at 20:16

Rather than looking through everything and checking if each is a file or non-empty directory, why not just search for files and non-empty directories in the first place?

Based on your statement that "I want to print out all sub-directory names with their content file names below the directory name", that'd just be:

find . -mindepth 2 -maxdepth 2 ! -path . -type f -print0 |
    awk -v RS='\0' -F'/' '!seen[$(NF-1)]++{print "dir", $(NF-1)} {print "file", $NF}'

For example:

$ find . -printf '%y %p\n'
d .
f ./file
d ./tmp1
d ./tmp2
f ./tmp2/bar
f ./tmp2/foo

$ find . -mindepth 2 -maxdepth 2 ! -path . -type f -print0 |
    awk -v RS='\0' -F'/' '!seen[$(NF-1)]++{print "Processing directory", $(NF-1)} {print "Encoding", $NF}'
Processing directory tmp2
Encoding bar
Encoding foo

Massage to suit. If you NEED to execute a shell command on each file then there's no need to get awk involved and you can just change the above to:

$ cat ../tst.sh
#!/usr/bin/env bash
declare -A seen
while read -r -d '' line; do
    path="${line%/*}"      # ./tmp2/bar -> ./tmp2
    dir="${path#*/}"       # ./tmp2     -> tmp2
    file="${line##*/}"     # ./tmp2/bar -> bar
    (( ! seen[$dir]++ )) && printf 'Processing directory %s\n' "$dir"   # header once per dir
    printf 'Encoding %s\n' "$file"
done < <(find . -mindepth 2 -maxdepth 2 ! -path . -type f -print0)

$ ../tst.sh
Processing directory tmp2
Encoding bar
Encoding foo
Ed Morton

Solution in pure bash:

is_empty_dir() { [[ "*..." = "$(printf %s * .*)" ]]; }

It should safely handle file names starting with . or -.

It has a zero exit status if the current directory is empty, and a non-zero one otherwise.


Example usage:

is_empty_dir() { [[ "*..." = "$(printf %s * .*)" ]]; }

if [[ -d /tmp ]]
then
    cd /tmp
    if is_empty_dir
    then
        echo "/tmp is empty"
    else
        echo "/tmp is NOT empty"
    fi
else
    echo "/tmp does not exist"
fi
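
Tied back to the loop in the question, a minimal sketch (assuming `$mydir` is set as in the question, and the default glob settings that `is_empty_dir` relies on):

is_empty_dir() { [[ "*..." = "$(printf %s * .*)" ]]; }

for dirs in "$mydir"/*
do
    [ -d "$dirs" ] || continue                  # skip plain files such as file1.txt
    ( cd "$dirs" && is_empty_dir ) && continue  # skip directories with no entries
    echo "Processing directory $dirs"
    for subfiles in "$dirs"/*
    do
        echo "Encoding $subfiles"
    done
done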
jiwopene
  • Note that the `function` keyword is a ksh-ism bash supports for backwards compatibility; however, bash's implementation isn't completely compatible with the ksh one (which makes some variable declarations local-by-default). Better to use the POSIX-compliant syntax `is_empty_dir() { ...` with no `function` keyword; see http://wiki.bash-hackers.org/scripting/obsolete – Charles Duffy May 12 '19 at 20:13

Enabling nullglob might simplify things:

shopt -s nullglob
files=(some/dir/*)
if (( ${#files[@]} )); then
    echo directory exists and is not empty
fi

When nullglob is set, glob expressions that do not match anything expand to nothing (zero words) instead of being left as literal text, so the `files` array above is empty when the directory is empty or missing.
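
Applied to the loop in the question, that could look like this sketch (assuming `$mydir` is set as in the question):

shopt -s nullglob
for dirs in "$mydir"/*
do
    files=("$dirs"/*)
    if [ -d "$dirs" ] && (( ${#files[@]} ))
    then
        echo "Processing directory $dirs"
        for subfiles in "${files[@]}"
        do
            echo "Encoding $subfiles"
        done
    fi
done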

Cole Tierney
  • 9,571
  • 1
  • 27
  • 35