
I have a process that creates output JPG files, and a cleanup script that should delete all of those files except the 100 most recent. I used this command:

find -H /tmp -name \*.jpg | xargs ls -rt | head -n -100 | xargs rm -f

This works fine, except when not a single jpg file is present: in that case find produces no output, so xargs runs ls -rt with no arguments at all, which lists ALL files in the current directory (/tmp, in my case); rm -f then ends up removing old files that don't match the *.jpg pattern. I use a workaround like this:

JPG_FILES=$(find -H /tmp -name \*.jpg)
if [ -n "$JPG_FILES" ]
then
    echo $JPG_FILES | xargs ls -rt | head -n -100 | xargs rm -f
fi

which works fine, but I suspect there is a better workaround, or simply a better way to do this kind of cleanup?
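
One possible fix I came across: GNU xargs has an -r (--no-run-if-empty) flag that skips running the command entirely when its input is empty, so the accidental bare ls -rt never happens. A minimal sketch (it assumes GNU xargs, and it still breaks on filenames containing whitespace):

find -H /tmp -name \*.jpg | xargs -r ls -rt | head -n -100 | xargs -r rm -f

Would that be considered a clean solution?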

PS: I know about this question, but that gives me the bug where random old files that are not JPGs are removed.

Chris Maes
  • Use [Charles Duffy's answer](https://stackoverflow.com/a/26765341/5291015), modify the statement to `(( ++count > 100 ))`, and add `-name "*.jpg"` to the `find` command to suit your requirement (see the sketch after these comments) – Inian Jul 18 '17 at 12:08
  • You could evaluate the result of the find and check if it's empty, only continuing if it's not. Something like `` [ ! -z `find -name 'test'` ] && echo found ``, where you exchange the echo after && for the ls command etc. (you will need backticks around the find command) – telina Jul 18 '17 at 12:19
  • @telina you could at least have read my question to the end... – Chris Maes Jul 18 '17 at 14:53
  • I mean I only tried to help. Maybe I didn't understand what you want then. I thought it would be a nice way to put it in one line. – telina Jul 19 '17 at 09:16
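
For reference, a sketch of the count-based approach Inian points to (not a verbatim copy of the linked answer): it assumes GNU find, GNU sort, and bash, and uses NUL delimiters so filenames containing whitespace are handled safely:

count=0
while IFS= read -r -d '' entry; do
    file=${entry#* }                         # strip the leading mtime field
    (( ++count > 100 )) && rm -f -- "$file"  # keep the 100 newest, delete the rest
done < <(find -H /tmp -name '*.jpg' -printf '%T@ %p\0' | sort -rnz)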
