I have a line of code in a script I use for backing up client SQL databases. It trims the stored backup files to the last 7 days, to stop the folder growing out of control over time.
However, I have noticed a problem. The client runs the script very infrequently, so when they do run it, it deletes everything apart from the new file created at the time. I could tell them to run it every day, but you know how it is, it won't happen, and I need to mitigate against this.
I would like the script to keep the most recent 7 days' worth of files, rather than deleting everything older than 7 days at the moment it runs, if that makes sense?
forfiles -p "Cache" -s -m *.* -d -7 -c "cmd /c del @path"
Is there a way to alter this so it reads the dates of the files and always keeps the most recent 7 days' worth, no matter when the script is run, rather than treating everything present at the point of execution as older than 7 days and deleting it all?
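The closest I've come up with is a dir /o-d loop that keeps the newest 7 files (not strictly 7 days, so it assumes roughly one backup file per day) and deletes the rest, but I haven't tested it:

rem Untested sketch: list files in Cache newest-first, skip the 7 newest, delete the rest
rem (/b = bare names, /o-d = newest first, /a-d = exclude directories)
rem Inside a .bat file use %%F; typed directly at a prompt use %F
for /f "skip=7 delims=" %%F in ('dir /b /o-d /a-d "Cache\*.*"') do del "Cache\%%F"

Is something like this the right direction, or is there a cleaner way with forfiles itself?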