
I have written a motion-detection WinForms C# desktop app.

The motion frames are saved as individual JPEGs on my hard drive.

There are 4 cameras I record from. The camera is identified by the variable:

   camIndex

Each JPEG is saved under the file structure:

c:\The Year\The Month\The Day\The Hour\The Minute

to ensure that no single directory accumulates too many files.
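
For illustration, the per-minute directory is derived roughly like this (`BuildMinuteDirectory` is a hypothetical helper name; the `Catalogues\camIndex` segment matches the cleanup code further down):

// Hypothetical helper: builds the per-minute directory for one camera and timestamp,
// matching the year\month\day\hour\minute layout described above.
static string BuildMinuteDirectory(string root, int camIndex, DateTime ts)
{
    return Path.Combine(root, "Catalogues", camIndex.ToString(),
                        ts.Year.ToString(), ts.Month.ToString(), ts.Day.ToString(),
                        ts.Hour.ToString(), ts.Minute.ToString());
}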

The intention is for my app to be running 24/7. The app can stop for reasons such as a system reboot or the user choosing to temporarily close it down.

At the moment I have a timer that runs every 5 minutes to delete files that are, say, more than 24 hours old.
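
Roughly, the timer is wired up like this (`CleanupOldFrames()` is a hypothetical stand-in for the cleanup code shown below):

// Hypothetical wiring: run the cleanup every 5 minutes on a background timer.
var cleanupTimer = new System.Timers.Timer(5 * 60 * 1000); // interval in milliseconds
cleanupTimer.Elapsed += (sender, e) => CleanupOldFrames(); // deletes frames older than 24 hours
cleanupTimer.AutoReset = true;
cleanupTimer.Start();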

I have found that the following code is memory-intensive, and over a period of a few days explorer.exe's RAM usage has climbed.

My motion app needs to be on the whole time, so a low memory footprint is essential for this 'archiving'...

The following code is long and seems to me hugely inefficient. Is there a better way I can achieve my aims?

I use this code:

List<string> catDirs = Directory.EnumerateDirectories(Shared.MOTION_DIRECTORY, "*", SearchOption.TopDirectoryOnly).ToList();
for (int index = 0; index < catDirs.Count; index++)
{
    for (int camIndex = 0; camIndex < 4; camIndex++)
    {
        if (Directory.Exists(catDirs[index] + "\\Catalogues\\" + camIndex.ToString()))
        {
            List<string> years = GetDirectoryList(catDirs[index] + "\\Catalogues\\" + camIndex.ToString(), true);
            if (years.Count == 0)
            {
                Directory.Delete(catDirs[index]);
            }
            for (int yearIndex = 0; yearIndex < years.Count; yearIndex++)
            {
                DirectoryInfo diYear = new DirectoryInfo(years[yearIndex]);
                List<string> months = GetDirectoryList(years[yearIndex], true);
                if (months.Count == 0)
                {
                    Directory.Delete(years[yearIndex]);
                }
                for (int monthIndex = 0; monthIndex < months.Count; monthIndex++)
                {
                    DirectoryInfo diMonth = new DirectoryInfo(months[monthIndex]);
                    List<string> days = GetDirectoryList(months[monthIndex], true);
                    if (days.Count == 0)
                    {
                        Directory.Delete(months[monthIndex]);
                    }
                    for (int dayIndex = 0; dayIndex < days.Count; dayIndex++)
                    {
                        DirectoryInfo diDay = new DirectoryInfo(days[dayIndex]);
                        List<string> hours = GetDirectoryList(days[dayIndex], true);
                        if (hours.Count == 0)
                        {
                            Directory.Delete(days[dayIndex]);
                        }
                        for (int hourIndex = 0; hourIndex < hours.Count; hourIndex++)
                        {
                            DirectoryInfo diHour = new DirectoryInfo(hours[hourIndex]);
                            List<string> mins = GetDirectoryList(hours[hourIndex], false);
                            if (mins.Count == 0)
                            {
                                Directory.Delete(hours[hourIndex]);
                            }
                            for (int minIndex = 0; minIndex < mins.Count; minIndex++)
                            {
                                DirectoryInfo diMin = new DirectoryInfo(mins[minIndex]);
                                // Rebuild the timestamp from the directory names.
                                DateTime foundTS = new DateTime(Convert.ToInt16(diYear.Name), Convert.ToInt16(diMonth.Name), Convert.ToInt16(diDay.Name),
                                    Convert.ToInt16(diHour.Name), Convert.ToInt16(diMin.Name), 00);
                                // upToDateDate and diff (the age threshold in minutes) are set elsewhere.
                                double minutesElapsed = upToDateDate.Subtract(foundTS).TotalMinutes;
                                bool deleteMe = minutesElapsed > diff;
                                if (deleteMe)
                                {
                                    Directory.Delete(mins[minIndex], true);
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
  • That is a *lot* of nested `for` loops! Your question would be clearer if you explained exactly what the code does. What is the directory structure? How are the files named? What are you using to determine their age? – Matt Burland Jan 06 '15 at 20:54
  • You need to extract out some logic into methods and invert your guards. – ChaosPandion Jan 06 '15 at 20:56
  • @MattBurland Hi, I admit my code is rubbish. I had tried to think of a way of doing a 'lookup table' instead of loops. There are 2 issues here: the amount of nested loops and the actual delete itself – Andrew Simpson Jan 06 '15 at 20:56
  • @AndrewSimpson: Then you need to explain what you are doing a little better. Am I right in thinking that you have your files organized in nested folders by year / month / day / hour? And why do you need your clean up to run every 5 minutes? Do you really need to clean up that aggressively? – Matt Burland Jan 06 '15 at 20:59
  • @MattBurland yes, matt that is correct. I could make the clearing up run less frequently - and I have. But the explorer.exe process memory goes up regardless of my code's poor design – Andrew Simpson Jan 06 '15 at 21:01
  • Here's what I might consider doing: keep a `Queue` of the files you've written and just delete them off the end of the `Queue`. If you know the rate at which you are writing files you could clear them up by just deleting the oldest *n* files (see the sketch after these comments). – Matt Burland Jan 06 '15 at 21:01
  • Generally not a great idea to swallow exceptions everywhere either – geedubb Jan 06 '15 at 21:01
  • 2
    @AndrewSimpson - Here you go: http://stackoverflow.com/questions/268132/invert-if-statement-to-reduce-nesting – ChaosPandion Jan 06 '15 at 21:01
  • 1
    Really looks like nested directories by time frames. Why do you need that if you flush everything every 24 hour? Could simplify the code to 2-3 lines if you did not have to loop like that. – Pierre-Luc Pineault Jan 06 '15 at 21:03
  • @Pierre-LucPineault Yes, it is worth giving that a go. But the explorer.exe will still probably rise over a period of days – Andrew Simpson Jan 06 '15 at 21:03
  • 1
    @AndrewSimpson: You could simply rebuild the queue when the program starts back up. Just scan your directories and rebuild the queue. – Matt Burland Jan 06 '15 at 21:04
  • 1
    Don't know if this is helpful or not but you could look into `File.GetCreationTime`: http://msdn.microsoft.com/en-us/library/system.io.file.getcreationtime%28v=vs.110%29.aspx – pcnThird Jan 06 '15 at 21:05
  • @MattBurland, hmm, yes. That would give me my lookup table. Take the hit on startup. Think I will give that a go. Thanks – Andrew Simpson Jan 06 '15 at 21:05
  • This link: http://stackoverflow.com/questions/186737/whats-the-fastest-way-to-delete-a-large-folder-in-windows/6208144#6208144 shows I am not going mad. Repeated requests to delete many files cause a problem. Simple enough question. I do not know how to explain it any more simply than I have – Andrew Simpson Jan 06 '15 at 21:10
  • How many files do you have in a 24 hour period? – Pierre-Luc Pineault Jan 06 '15 at 21:11
  • @Pierre-LucPineault It varies with the motion detected, but each file is about 10 KB. The maximum disk space taken can be 2.2 GB per day – Andrew Simpson Jan 06 '15 at 21:12
  • Hence why I need to be aggressive to keep on top of it – Andrew Simpson Jan 06 '15 at 21:13
  • 1
    http://stackoverflow.com/questions/11000972/delete-files-that-are-more-than-than-10-days-old-using-linq – Pedro Jan 06 '15 at 21:23
  • Another quick thought - if I understand this correctly, you only create new files if your motion sensor detects something, right? So how about firing off the clean-up (maybe as a separate low-priority task) only when you start recording? That way you're not searching through masses of files that haven't changed since the last clean-up. – Matt Burland Jan 06 '15 at 21:36
  • @MattBurland that is a very good idea and worth testing. Thank you – Andrew Simpson Jan 06 '15 at 21:37
  • I have added more info to my question. I actually have my answer, and my only concern is that the people who have helped with good info do not have their time wasted. I have removed the try{}catch{} (as I should have done at the start) and I have added the context the code is running in. – Andrew Simpson Jan 07 '15 at 08:14
  • Just to add a comment for the people who voted to close this question down: other people HAVE understood the question and answered and commented, so why can you not understand my point? – Andrew Simpson Jan 07 '15 at 08:44
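
A minimal sketch of the queue-based approach Matt Burland suggests above, with the queue rebuilt by scanning the directories on startup. The `FrameQueue` class and all its members are hypothetical names, and it assumes frames are enqueued in creation order:

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Hypothetical sketch: track written files oldest-first, then delete from the
// head of the queue until the remaining files are younger than the cutoff.
class FrameQueue
{
    private readonly Queue<string> _files = new Queue<string>();

    // On startup, rebuild the queue by scanning the existing frames, oldest first.
    public void Rebuild(string root)
    {
        foreach (var file in Directory.EnumerateFiles(root, "*.jpg", SearchOption.AllDirectories)
                                      .OrderBy(File.GetCreationTime))
        {
            _files.Enqueue(file);
        }
    }

    // Called whenever a new motion frame is written to disk.
    public void Add(string path)
    {
        _files.Enqueue(path);
    }

    // Delete from the head of the queue until the oldest remaining file is
    // younger than maxAge; only files that actually need deleting are touched.
    public void Purge(TimeSpan maxAge)
    {
        DateTime cutoff = DateTime.Now - maxAge;
        while (_files.Count > 0 && File.GetCreationTime(_files.Peek()) < cutoff)
        {
            File.Delete(_files.Dequeue());
        }
    }
}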

1 Answer


This might help; I have not tested its efficiency, though:

Directory
    .GetFiles(@"[Path of your root directory]", "*.*", SearchOption.AllDirectories)
    .Where(item =>
    {
        try
        {
            // Keep only files created more than 24 hours ago.
            var fileInfo = new FileInfo(item);
            return fileInfo.CreationTime < DateTime.Now.AddHours(-24);
        }
        catch (Exception)
        {
            // Skip files whose attributes cannot be read.
            return false;
        }
    })
    .ToList()
    .ForEach(File.Delete);

Make sure you add proper exception handling and avoid a zombie try / empty catch (swallowing exceptions is not recommended).

  • I would avoid the `ToList` and use a `foreach` loop (see the sketch after these comments). – ChaosPandion Jan 06 '15 at 21:20
  • 1
    Agree, that might help saving some CPU usage especially when list goes big. – Jsinh Jan 06 '15 at 21:22
  • 1
    With 2.2Gb worth of files (weighting 10kb each, which means a _lot_), enumerating them one by one will be highly, _highly_ inefficient. That's a lot of I/O operations for a 5 minutes timeframe. I think burning stuff at the directory level was the good approach (albeit badly implemented), and should be in a much bigger timeframe. – Pierre-Luc Pineault Jan 06 '15 at 21:22
  • @Pierre-LucPineault - sure, but it is definitely better than the code posted above. Also, if this check runs every five minutes then I don't think that would be a big problem. Plus the size of the file doesn't matter; it only reads the file attribute for the creation date/time. Do you suggest any better way? Also - probably use http://msdn.microsoft.com/en-us/library/hh285054(v=vs.110).aspx if your list gets really big (@Andrew Simpson) – Jsinh Jan 06 '15 at 21:26
  • 1
    The size don't matter, but the amount does. I'd maybe throw a Parallel.ForEach in there to try to speed things up a bit, if you really want to enumerate all files instead of blowing up a folder. – Pierre-Luc Pineault Jan 06 '15 at 21:30
  • 1
    @Pierre-LucPineault - Yes Parallel.ForEach would be a smart way to handle the amount of files rush - http://msdn.microsoft.com/en-us/library/dd460720(v=vs.110).aspx (cc: Andrew Simpson) – Jsinh Jan 06 '15 at 21:32
  • I tested the above answer and also tested using Parallel.ForEach. The Parallel.ForEach was marginally slower for some reason, by about 200 milliseconds. I have a quad-core processor as well, so I was surprised. But both gave good results. Thanks – Andrew Simpson Jan 07 '15 at 08:43
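
Pulling those comment suggestions together, here is a possible variant of the answer using `Directory.EnumerateFiles` and a plain `foreach`, so file paths are streamed rather than buffered into a list first (a sketch, untested like the original):

// Streams file paths lazily instead of materializing them all with GetFiles/ToList.
foreach (var item in Directory.EnumerateFiles(@"[Path of your root directory]", "*.*", SearchOption.AllDirectories))
{
    try
    {
        // Delete anything created more than 24 hours ago.
        if (File.GetCreationTime(item) < DateTime.Now.AddHours(-24))
        {
            File.Delete(item);
        }
    }
    catch (Exception ex)
    {
        // Log and move on rather than swallowing the exception silently.
        Console.WriteLine("Could not delete " + item + ": " + ex.Message);
    }
}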