
I want to delete scripts in a folder that are older than 10 days, counting back from the current date. The scripts look like:

2012.11.21.09_33_52.script
2012.11.21.09_33_56.script
2012.11.21.09_33_59.script

The script will run every 10 days with crontab; that's why I need the current date.

Steve88

3 Answers


`find` is the common tool for this kind of task:

find ./my_dir -mtime +10 -type f -delete

EXPLANATIONS

  • ./my_dir your directory (replace with your own)
  • -mtime +10 older than 10 days
  • -type f only files
  • -delete no surprise. Remove it to test your find filter before executing the whole command

And take care that ./my_dir exists, to avoid bad surprises!
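The test-first workflow described above can be sketched in a throwaway directory (the directory and file names here are made up for the demo; `touch -d` is GNU coreutils):

```shell
# Sketch: verify the filter with -print before adding -delete.
dir=$(mktemp -d)
touch -d "15 days ago" "$dir/old.script"   # mtime 15 days in the past
touch "$dir/new.script"                    # mtime now

# Dry run: list what WOULD be deleted (only old.script should appear)
find "$dir" -mtime +10 -type f -print

# Real run: same filter, with -delete appended at the end
find "$dir" -mtime +10 -type f -delete
ls "$dir"   # only new.script remains
```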

Gilles Quénot
  • `find /home/scripts/*.script -mtime +10 -type f -delete` will be OK to delete these? 2012.11.21.09_33_52.script 2012.11.21.09_33_56.script 2012.11.21.09_33_59.script – Steve88 Nov 21 '12 at 09:00
  • It depends on the modification date, i.e. what `ls -l` displays. Is the date the same as in `ls -l`? But a simple test will tell you =) – Gilles Quénot Nov 21 '12 at 09:06
  • yeah, it is the creation date of the script – Steve88 Nov 21 '12 at 09:11
  • So, what are you waiting for? Try it =) – Gilles Quénot Nov 21 '12 at 09:26
  • Be *VERY* careful to supply an absolute path on commands like these! Once, using a command very much like this in a cron job, I accidentally deleted every file on my production mail server older than 10 days, which I can tell you was no fun to recover from. – DSimon May 28 '14 at 20:00
  • Make sure you put the -delete at the end! I ended up deleting a directory instead of just the files; it didn't matter too much in my case but could have been painful. – alimack Apr 18 '16 at 15:13
  • Another important point regarding this comment: it describes -mtime as older than n days. There are 3 options that check whether a file is older than n days: atime, ctime and mtime. Use the option that fits your need. – Deepan Prabhu Babu Feb 09 '17 at 20:08
  • @DSimon Thanks for sharing your horror story to help us avoid our own! I had a few directories to do this to, so inspired by your comment, inside my `for a in ...` loop, I added a `if [ -d $a ]; then...` to my script! – theglossy1 Jun 07 '17 at 18:00
  • `mtime` is the most common; most FS/OS don't care about `atime` (it depends on mount settings in `/etc/fstab`, mostly) – Gilles Quénot Jun 07 '17 at 22:25
  • with mtime, if a file is being used frequently, it won't get deleted, am I right? Why not use ctime? – MaXi32 Dec 09 '20 at 17:01
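The three timestamps discussed in the comments above can be inspected for any file with `stat` (GNU coreutils; the file here is just a throwaway temp file):

```shell
f=$(mktemp)
stat -c 'atime: %x' "$f"   # last access           -> filtered by -atime/-amin
stat -c 'mtime: %y' "$f"   # last content change   -> filtered by -mtime/-mmin
stat -c 'ctime: %z' "$f"   # last metadata change  -> filtered by -ctime/-cmin
rm -f "$f"
```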

Just spicing up the shell script above to delete older files, but with logging and calculation of the elapsed time:

#!/bin/bash

path="/data/backuplog/"
timestamp=$(date +%Y%m%d_%H%M%S)
filename="log_$timestamp.txt"
log="$path$filename"
days=7

START_TIME=$(date +%s)

find "$path" -maxdepth 1 -name "*.txt" -type f -mtime +$days -print -delete >> "$log"

echo "Backup :: Script Start -- $(date +%Y%m%d_%H%M)" >> "$log"

# ... code for backup ... or any other operation ... >> "$log"

END_TIME=$(date +%s)

ELAPSED_TIME=$(( END_TIME - START_TIME ))

echo "Backup :: Script End -- $(date +%Y%m%d_%H%M)" >> "$log"
# -u -d @SECONDS formats an epoch offset as HH:MM:SS (works for runs under 24h)
echo "Elapsed Time :: $(date -u -d @$ELAPSED_TIME +%Hh:%Mm:%Ss)" >> "$log"

The code adds a few things.

  • log files named with a timestamp
  • log folder specified
  • find looks for *.txt files only in the log folder
  • -type f ensures you only delete files
  • -maxdepth 1 ensures you don't descend into subfolders
  • log files older than 7 days are deleted ( assuming this is for a backup log)
  • notes the start / end time
  • calculates the elapsed time for the backup operation...

Note: to test the code, just use -print instead of -print -delete. But do check your path carefully.

Note: Do ensure your server time is set correctly via `date` (set up the timezone/NTP correctly). Additionally, check file times with `stat filename`.

Note: -mtime can be replaced with -mmin for finer control, since -mtime discards fractions of a day when computing file ages (older than 2 days, i.e. +2, actually means at least 3 days):

-mtime +$days  --->  -mmin  +$((60*24*$days))
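The conversion above is plain shell arithmetic; a quick sketch with the same `days` value as in the script:

```shell
days=7                      # same value as in the script above
mins=$((60 * 24 * days))    # 7 days expressed in minutes
echo "$mins"                # → 10080

# so the finer-grained equivalent of the script's filter is:
#   -mtime +$days   becomes   -mmin +$mins
```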
MarcoZen
  • Hi @MarcoZen. I have 2 `find` and `delete` commands writing to the same `$log`. Why does it create three log files, with the first two having no content and the last file having the list of files deleted? What can I do so that the find-and-delete command doesn't generate multiple files? – Fokwa Best Jan 25 '18 at 17:32
  • @FokwaBest – Could it be you created another log file because of the timestamp? Are you using the above code fully? Can you pastebin it for me to check? – MarcoZen Jan 27 '18 at 04:51
  • Hi @MarcoZen, I had to remove `_%H%M%S`. For a small number of files, only one log file was generated, but when the number of files to delete was large, multiple log files were generated with different `_%H%M%S`. After removing this part, everything is written to one file. – Fokwa Best Jan 27 '18 at 06:14
  • @FokwaBest – Interesting... I don't have that problem. Good that you have solved it. I think you are calling this script several times? – MarcoZen Jan 28 '18 at 14:25

If you can afford to go by the file's date, you can do

find -mmin +14400 -delete
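Here 14400 minutes is 10 × 24 × 60, i.e. the 10 days from the question. Note that omitting the start path is a GNU `find` extension (it defaults to `.`); a sketch with an explicit, throwaway temp directory (`touch -d` is also GNU):

```shell
# 14400 minutes = 10 days
echo $((10 * 24 * 60))   # → 14400

# Same filter with an explicit path:
dir=$(mktemp -d)
touch -d "20 days ago" "$dir/stale.script"
find "$dir" -mmin +14400 -type f -delete
ls -A "$dir"   # empty: stale.script was older than 14400 minutes
```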
glglgl
  • To the [anonymous editor](http://stackoverflow.com/review/suggested-edits/5805137): Which version of `find` has a `-rm-rf` option? – glglgl Sep 19 '14 at 07:10