I'm developing a script to process compressed files that get dropped into a certain folder and move the results out of it.
The script works perfectly as long as what gets dropped into the folder is actually a compressed file. However, if the script runs when there are no compressed files to process, the parameter expansion ${f%.gz} gives unexpected results.
Here is the script, with an example of the problem case afterwards:
FILES="$INGEST_DIR/*.gz"
for f in $FILES
do
    JUSTFILENAME=${f##/*/}
    syslog -s -l n "Archiving \"$JUSTFILENAME\""
    UNZIPPEDPATH=${f%.gz}
    syslog -s -l n "Moving \"$UNZIPPEDPATH\""
    UNZIPPEDNAME=${UNZIPPEDPATH##/*/}
    syslog -s -l n " to \"$ASR_DIR/$UNZIPPEDNAME\""
    syslog -s -l n "gunzip-ing $f"
    gunzip "$f"
    mv "$UNZIPPEDPATH" "$ASR_DIR/$UNZIPPEDNAME"
done
Again, it works perfectly if there's at least one .gz file in the target directory.
If there aren't any .gz files, but there are other files in the directory (which must be there for other reasons), $FILES contains the expanded $INGEST_DIR plus the literal /*.gz, like this:
INGEST_DIR=/path/to/foo
FILES="$INGEST_DIR/*.gz"
echo $FILES
will show
/path/to/foo/*.gz
That isn't especially bad on its own, except that
for f in $FILES
do
UNZIPPEDPATH=${f%.gz}
echo $UNZIPPEDPATH
done
yields

somefile.txt someotherfile.exe yetsomeotherfile.dat

because ${f%.gz} strips the .gz suffix, leaving the pattern /path/to/foo/*, which the unquoted echo then expands to every file in the directory.
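A minimal reproduction of that re-expansion (using a throwaway temp directory and made-up file names, not my real $INGEST_DIR):

```shell
#!/bin/bash
# Reproduce the symptom in an illustrative scratch directory
demo=$(mktemp -d)
touch "$demo/somefile.txt" "$demo/someotherfile.exe"

f="$demo/*.gz"     # no .gz files exist, so the glob survives literally
echo ${f%.gz}      # strips ".gz" -> "$demo/*", which unquoted echo re-expands
```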
So is there an elegant way to skip the iteration entirely when there are no such compressed files to handle? My script works as well as it does only because I just learned about ${f##/*/} and ${f%.gz} from this SO question & answer, so I'm thinking there might be a better way than

FILES="$INGEST_DIR/*.gz"

to start things off... or something to do right away before heading into the for loop.
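For context, here's a sketch of one direction I've come across but haven't verified in my setup: bash's nullglob option, under which an unmatched glob expands to nothing instead of passing through literally, so the loop body simply never runs (the directory and file names below are stand-ins):

```shell
#!/bin/bash
# Sketch, not verified in my environment: with nullglob set, an unmatched
# glob expands to zero words rather than the literal pattern.
INGEST_DIR=$(mktemp -d)           # stand-in for the real ingest directory
touch "$INGEST_DIR/other.txt"     # non-.gz files present, no .gz files

shopt -s nullglob
for f in "$INGEST_DIR"/*.gz; do   # glob used directly, not via a $FILES string
    echo "would process $f"       # never reached while no .gz files exist
done
shopt -u nullglob
# (An alternative I've seen for shells without nullglob: keep the loop as-is
#  and `[ -e "$f" ] || continue` when the literal pattern slips through.)
```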