
I often create log/report files using various shell scripts. This is fine, but most of the time I only need to look at the contents briefly - usually the same day or up to a week later.

What I'd like to do is to be able to flag the file as expiring so after a given period of time, the file is either deleted or moved to an archive directory.

As I have control over the file creation, I could of course give it a certain extension, e.g. .SEF, and write a service that periodically scans the directories on my hard drive, but this seems a bit clunky.

I've also looked into custom file attributes:

http://social.msdn.microsoft.com/Forums/en/netfxbcl/thread/173ab6de-3cd8-448f-8c42-de9d2240b913

But I'd need to write something to add the file attribute to the file. So in PowerShell/SFU or similar:

cat JobOutput.txt | grep -e Error | expire > report.txt

Or plain old windows:

type JobOutput.txt | findstr Error | expire > report.txt

Has anyone done anything remotely like this?

Edit:

What I've described above is just one facet of what I'd like to do. For example, you might want to send an email which requires a response within a time limit and then gets deleted after that time.

Another example might be a document you release with details of a temporary system workaround which you don't want used after a given amount of time.

We're into the vagaries of various applications now of course, but the idea of custom file attributes looked promising and appealing. The problem of course would be how they'd be applied to each of the file types...

For now, I'm happy to close this based on the question as I originally posed it, in which case, I could define a custom extension which gets cleared up by a scheduled job.

Solution:

OK, I decided to go with the PowerShell solution (with a slight tweak) since I'm pretty familiar with that:

$HowManyDays = 15
$LastWrite = (Get-Date).AddDays(-$HowManyDays)
Get-ChildItem -Recurse -Include *.sef |
    Where-Object {$_.LastWriteTime -le $LastWrite} |
    Remove-Item

Many thanks for your help on this.

Robbie Dee
  • Using a custom file extension in conjunction with the last modified date is the most common way to do this - why is that not an option? – Oded Aug 30 '12 at 10:27
  • A batch file calling robocopy via the windows task scheduler is the simplest way – Alex K. Aug 30 '12 at 10:28
  • @Oded What would be responsible for the deletion/archiving part? – Robbie Dee Aug 30 '12 at 10:34
  • @AlexK. I'm not familiar with robocopy - could you give me an example? – Robbie Dee Aug 30 '12 at 10:35
  • A third party process - as @AlexK. suggests, using the task scheduler (with a powershell script or batch file) is simplest. – Oded Aug 30 '12 at 10:35
  • @RobbieDee it's a Microsoft command-line file-copy utility; it can copy/mirror/move based on age (or pretty much anything else you can think of). It ships with Vista/7 & is downloadable for older versions; http://ss64.com/nt/robocopy.html – Alex K. Aug 30 '12 at 10:37
  • E.g. move files in dir + subdirs older than 7 days; `robocopy "c:\from" "c:\to" /s *.log /MINAGE:7 /MOV` – Alex K. Aug 30 '12 at 10:42
  • @AlexK. Superb - exactly what I need! :) – Robbie Dee Aug 30 '12 at 10:59
  • If you change `*.log` to `*.*` it should do it for all files, regardless of type? – SSS Sep 24 '12 at 07:59
  • That would work for say, a log folder but I was looking for a system wide solution... – Robbie Dee Sep 25 '12 at 10:24
  • You could schedule a robocopy, shell script, Ruby, Python, or whatever you want to delete or move files. To schedule it you could use the AT command from the command line. – user1154664 Sep 25 '12 at 16:11

4 Answers


If you are fairly happy with what Alex K. provided then it should be easy to tweak.

Create a cmd file and pass in the file search pattern when you call it; if that's *.*, then so be it.

In the cmd file call:

robocopy "c:\from" "c:\to" /s %1 /MINAGE:7 /MOV 

See: How do I pass command line parameters to a batch file?

Peter

We use a small script in PowerShell, scheduled through Task Scheduler, to remove logs older than 15 days.

You can remove the files or move them. Here is the script:


$HowManyDays = 15
$LastWrite = (Get-Date).AddDays(-$HowManyDays)
Get-ChildItem *.log |
    Where-Object {$_.LastWriteTime -le $LastWrite} |
    Remove-Item

tucaz

I'd go for the scheduled invocation too.

If you'd rather not invoke PowerShell or download robocopy, you can stick to a command-line one-liner:

forfiles /p "C:\Path" /m *.log /d -7 /c "cmd /c del @file"

I tweaked this and baked it into a batch file that I can schedule to clean any folder and its subfolders. For reference, that gives you something like:

forfiles /p "C:\Path" /m *.* /d -7 /c "cmd /c if [@isdir]==[TRUE] rmdir @file /s /q"
Grimace of Despair

You have two problems:

1. How to indicate which files should be subject to "expiry".
2. How to (quickly) find expired files.

For 1, you need to decide if a file extension is an acceptable approach or whether you need to use some custom file attribute, and whether this will be stored in the NTFS filesystem or embedded in the content of the file.

For 2, you could create an index using e.g. locate32, but this will only allow you to quickly find files based on extension; custom attributes are out -- unless you want to grab the source to locate32 and add the ability to index custom attributes. That could be very useful (e.g. Word documents with custom properties, or version information in DLLs). Another approach to indexing would be to create symbolic links to files subject to expiration in a "to be expired" directory.

The actual deletion is trivial and you can use any of the solutions posted here; using an index just makes it faster to find expired files.

Colin 't Hart
  • I've just added some more detail to the question. I'll have a think about the proffered solutions in light of this and post back later. – Robbie Dee Sep 26 '12 at 16:10