I'm writing a bash script that needs to delete old files.

It's currently implemented using:

find $LOCATION -name $REQUIRED_FILES -type f -mtime +1 -delete

This will delete all of the files older than 1 day.

However, what if I need a finer resolution than 1 day, say 6 hours old? Is there a nice clean way to do it, like there is using find and -mtime?

Tom Feiner

9 Answers


Does your find have the -mmin option? That can let you test the number of mins since last modification:

find $LOCATION -name $REQUIRED_FILES -type f -mmin +360 -delete

Or maybe look at using tmpwatch to do the same job. phjr also recommended tmpreaper in the comments.
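A minimal sketch of the safe workflow (the location and pattern below are made-up values for illustration): quote the pattern so the shell passes it to find unexpanded, and do a dry run with -print before switching to -delete.

```shell
# Hypothetical location and pattern for illustration only
LOCATION=/var/tmp/myapp
REQUIRED_FILES='*.log'

# Dry run: -print lists what would be deleted
# (files last modified more than 360 minutes ago)
find "$LOCATION" -name "$REQUIRED_FILES" -type f -mmin +360 -print

# Same test, but actually delete
find "$LOCATION" -name "$REQUIRED_FILES" -type f -mmin +360 -delete
```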

Paul Dixon
  • Using `-mmin +X` returns all files with my find. My fault for not checking this first, but this command just deleted most of my home directory. For me, `-mmin -X` is the correct argument. – brandones Oct 16 '13 at 00:08
  • **tmpreaper** is a fork of tmpwatch. It is safer, and exists as a debian package in the distro. Benefits over find -delete : tmpreaper will not remove symlinks, sockets, fifos, or special files – Nadir Jul 02 '15 at 09:29
  • Point out that $REQUIRED_FILES needs to be in double quotes. You can use a pattern like "*.txt" for all .txt files, or a pattern like "temp-*" to delete all files whose names begin with temp- – achasinh Oct 04 '17 at 10:01
  • @PaulDixon how to modify this so that `$LOCATION` and `$REQUIRED_FILES` can both have multiple values such as `dir1 dir2` and `*.txt *.tmp` ? – Enissay Jul 30 '18 at 11:07
  • @Enissay $LOCATION is a single directory. For multiple extensions you'd probably want to use a pattern with -regex - see https://stackoverflow.com/questions/5249779/how-to-use-regex-in-file-find – Paul Dixon Jul 30 '18 at 19:57

Here is the approach that worked for me (and I don't see it used above):

$ find /path/to/the/folder -name '*.*' -mmin +59 -delete > /dev/null

This deletes all files older than 59 minutes while leaving the folders intact.

Axel Ronsin
    Better to single-quote `'*.*'` or the shell will expand it to actual filenames instead of keeping it as a wildcard for `find` to resolve. This breaks `find`'s recursive operation on subdirs. – MestreLion Feb 20 '19 at 17:51
  • 2
    Also keep in mind that `-name '*.*'` will not delete files that have no extension, such as `README`, `Makefile`, etc. – MestreLion Feb 20 '19 at 17:53
  • 3
    If you're trying to just match files, use `-type f` instead – nickf Aug 11 '20 at 15:38

You could try this trick: create a file whose timestamp is 1 hour old, and use the -newer file argument.

(Or use touch -t to create such a file).
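A minimal sketch of this trick, assuming GNU touch's -d option is available (on systems without it, compute a timestamp for touch -t as shown in the comments below); the directory path is a placeholder:

```shell
# Create a reference file whose mtime is 1 hour in the past
ref=$(mktemp)
touch -d '1 hour ago' "$ref"

# Files NOT newer than the reference are at least 1 hour old;
# dry-run with -print before replacing it with -delete
find /path/to/dir -type f ! -newer "$ref" -print

# Clean up the reference file
rm -f "$ref"
```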

xtofl
    there is no -older switch (at least in my find command), and that's what would be needed. -newer doesn't help. – iconoclast Feb 17 '11 at 06:53
  • can you give a touch command that would generate a file 1 hour old that will work on machines that can't use -mmin? (If you're on Linux, -mmin is available, if not then date and other commands are also feeble in comparison.) – iconoclast May 10 '11 at 17:20
  • 3
    @iconoclast `touch -t $(date -d '-1 hour' +%Y%m%d%H%M.00) test` Creates file `test` that's always 1 hour old. – rovr138 Jan 20 '18 at 15:29
  • 3
    you can also do `-not -newer X` to achieve what you're looking for – nickf Aug 11 '20 at 15:38
  • To `rm` files and directories older than `file.ext` run `rm -r \`find -maxdepth 1 -not -newer file.ext\``. To `rm` files and directories newer than `file.ext` do `rm -r \`find -maxdepth 1 -newer file.ext\``. To place `file.ext` where you want it in time run `touch -t $(date -d '-1 hour' +%Y%m%d%H%M.00) file.ext` where `'-1 hour'` specifies "1 hour ago". Credit: [xoftl](https://stackoverflow.com/a/249608/4682839), [rovr138](https://stackoverflow.com/a/249608/4682839#comment83702361_249608), [nickf](https://stackoverflow.com/a/249608/4682839#comment112040489_249608). – young_souvlaki Jul 09 '21 at 17:53
  • WARNING: `-not -newer file.ext` will delete `file.ext`. – young_souvlaki Jul 09 '21 at 18:11

For SunOS 5.10:

    Example 6 Selecting a File Using 24-hour Mode

    The descriptions of -atime, -ctime, and -mtime use the
    terminology n ``24-hour periods''. For example, a file
    accessed at 23:59 is selected by:

        example% find . -atime -1 -print

    at 00:01 the next day (less than 24 hours later, not more
    than one day ago). The midnight boundary between days has
    no effect on the 24-hour calculation.
Darren

If your version of find does not have -mmin, then -mtime -0.041667 gets pretty close to "within the last hour" (since 1/24 ≈ 0.041667). In your case, use:

-mtime +(X * 0.041667)

so, if X means 6 hours, then:

find . -mtime +0.25 -ls

works, because 24 hours * 0.25 = 6 hours.
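The fraction can be computed rather than hard-coded; a sketch assuming a find that accepts fractional -mtime values (GNU find does, but many older finds do not):

```shell
hours=6
# fraction of a day: hours / 24
frac=$(awk -v h="$hours" 'BEGIN { printf "%.6f", h/24 }')

# list files last modified more than $hours hours ago
find . -type f -mtime +"$frac" -print
```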

Malcolm Boekhoff
    Was hopeful because this old UNIX doesn't have -mmin, but, sadly this is of no help as this old UNIX also does not like fractional values for mtime: find: argument to -mtime must be an integer in the range -2147483647 to 2147483647 – kbulgrien Apr 16 '20 at 09:27

If one's find does not have -mmin, and if one is also stuck with a find that accepts only integer values for -mtime, then all is not necessarily lost if one considers that "older than" is similar to "not newer than".

If we were able to create a file that has an mtime of our cut-off time, we could ask find to locate the files that are "not newer than" our reference file.

Creating a file with the correct timestamp is a bit involved, because a system that doesn't have an adequate find probably also has a less-than-capable date command that cannot do things like: date +%Y%m%d%H%M%S -d "6 hours ago".

Fortunately, other old tools can manage this, albeit in a more unwieldy way.

To delete files that are over six hours old, we first have to find the time that was six hours ago. Six hours is 21600 seconds:

$ date && perl -e '@d=localtime time()-21600; \
  printf "%4d%02d%02d%02d%02d.%02d\n", $d[5]+1900,$d[4]+1,$d[3],$d[2],$d[1],$d[0]'
> Thu Apr 16 04:50:57 CDT 2020
202004152250.57

Since the perl statement produces the date/time information we need, use it to create a reference file that is exactly six hours old:

$ date && touch -t `perl -e '@d=localtime time()-21600; \
  printf "%4d%02d%02d%02d%02d.%02d\n", \
  $d[5]+1900,$d[4]+1,$d[3],$d[2],$d[1],$d[0]'` ref_file && ls -l ref_file
Thu Apr 16 04:53:54 CDT 2020
-rw-rw-rw-   1 root     sys            0 Apr 15 22:53 ref_file

Now that we have a reference file exactly six hours old, the "old UNIX" solution for "delete all files older than six hours" becomes something along the lines of:

$ find . -type f ! -newer ref_file -a ! -name ref_file -exec rm -f "{}" \;

It might also be a good idea to clean up our reference file...

$ rm -f ref_file
kbulgrien

-mmin is for minutes.

Try looking at the man page:

man find

for more options.

GavinCattell

find $PATH -name $log_prefix"*"$log_ext -mmin +$num_mins -exec rm -f {} \;
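A note of caution on the line above: $PATH is the shell's command search path, so assigning to it can break every subsequent command. A sketch with a differently named variable and hypothetical values:

```shell
# Hypothetical values; LOG_DIR avoids clobbering the shell's $PATH
LOG_DIR=/var/log/myapp
log_prefix=app
log_ext=.log
num_mins=360

# Delete matching files last modified more than $num_mins minutes ago
find "$LOG_DIR" -name "$log_prefix"'*'"$log_ext" -mmin +"$num_mins" -exec rm -f {} \;
```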

Eragonz91

Here is one way to do what @iconoclast was asking about in their comment on another answer.

Use a user crontab, or /etc/crontab, to (re)create the file /tmp/hour every hour:

# m h dom mon dow user  command
0 * * * * root /usr/bin/touch /tmp/hour > /dev/null 2>&1

and then use this to run your command:

find /tmp/ -daystart -maxdepth 1 -not -newer /tmp/hour -type f -name "for_one_hour_files*" -exec do_something {} \;
satyr0909