
What I'm trying to accomplish is the following: Suppose I have a function that writes an image to a File directory (either the SD card or the internal cache). After writing out the File, I check whether my image directory is under a certain total size (right now, I'm using this function to recursively calculate the directory's size). If the file I just added pushes the directory over the limit, then I want to keep deleting the oldest files until the directory is just below that maximum size.
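For reference, the recursive size check could look something like this (a minimal sketch; `directorySize` is a hypothetical helper name, not the linked function):

```java
import java.io.File;

public class DirSize {
    // Recursively sum the lengths of all regular files under dir.
    public static long directorySize(File dir) {
        long total = 0;
        File[] children = dir.listFiles();
        if (children == null) {
            return 0; // not a directory, or an I/O error occurred
        }
        for (File child : children) {
            if (child.isDirectory()) {
                total += directorySize(child);
            } else {
                total += child.length();
            }
        }
        return total;
    }
}
```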

I was thinking of first sorting the directory's files oldest-first (via a Comparator, in ascending order, using this example), then converting the array into an ArrayList to get its Iterator. Then, while the directory's size is still above the maximum and there are still files to iterate over, I delete the older files until I break out of that while loop. Is there a more efficient way of accomplishing this?
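A rough sketch of that plan (my own illustration, assuming a flat directory and sorting by `File.lastModified()` ascending; `trimToSize` and `maxSize` are made-up names):

```java
import java.io.File;
import java.util.Arrays;
import java.util.Comparator;

public class CacheTrimmer {
    // Delete the oldest files in dir until its total size is <= maxSize bytes.
    public static void trimToSize(File dir, long maxSize) {
        File[] files = dir.listFiles();
        if (files == null) {
            return; // not a directory, or an I/O error occurred
        }
        // Sort oldest first by last-modified timestamp (ascending).
        Arrays.sort(files, new Comparator<File>() {
            @Override
            public int compare(File a, File b) {
                return Long.compare(a.lastModified(), b.lastModified());
            }
        });
        long totalSize = 0;
        for (File f : files) {
            totalSize += f.length();
        }
        // Delete from the oldest end until we're back under the limit.
        for (File f : files) {
            if (totalSize <= maxSize) {
                break;
            }
            long len = f.length();
            if (f.delete()) {
                totalSize -= len;
            }
        }
    }
}
```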

Diego Tori

1 Answer


Your bottleneck is most likely going to be the file system operations (i.e., reading the directory contents and deleting the files), not the in-memory manipulation, so you probably shouldn't worry too much about the efficiency of the latter as long as you don't do something grossly inefficient.

The rough algorithm you describe sounds fine. You can avoid the ArrayList conversion by simply doing something like:

// `pairs` is assumed to be sorted oldest-first; each Pair holds its File in `f`.
for (Pair pair : pairs) {
    if (totalSize <= maxSize) {
        break; // the directory is back under the limit
    }

    totalSize -= pair.f.length();
    pair.f.delete();
}
Laurence Gonsalves