Let's say we have a loop like this:
foreach ($entries as $entry) { // let's say this loops 1000 times
    if (file_exists('/some/dir/' . $entry . '.jpg')) {
        echo 'file exists';
    }
}
I assume this has to access the HDD 1000 times, once to check whether each file exists.
What about doing this instead?
$files = scandir('/some/dir/');
foreach ($entries as $entry) { // let's say this loops 1000 times
    if (in_array($entry . '.jpg', $files)) {
        echo 'file exists';
    }
}
Question 1: If this only accesses the HDD once, then I believe it should be a lot faster. Am I right on this one?
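(For what it's worth, I was also thinking of flipping the scandir() result into array keys, so each check becomes a hash lookup with isset() instead of a linear in_array() scan over the file list. A rough sketch of what I mean, using the same '/some/dir/' example as above — not sure whether this part matters much in practice:

// Sketch only: same single-scandir() idea, but with the filenames as keys
// so each membership check is an isset() lookup rather than in_array().
$files = array_flip(scandir('/some/dir/'));

foreach ($entries as $entry) { // let's say this loops 1000 times
    if (isset($files[$entry . '.jpg'])) {
        echo 'file exists';
    }
}
)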
However, what if I have to check sub-directories for a file, like this:
foreach ($entries as $entry) { // let's say this loops 1000 times
    if (file_exists('/some/dir/' . $entry['id'] . '/' . $entry['name'] . '.jpg')) {
        echo 'file exists';
    }
}
Question 2: If I want to apply the same technique (files in an array) to check whether these entries exist, how can I scandir() the sub-directories into the array, so that I can check file existence the same way?
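To make Question 2 more concrete, here is a sketch of the kind of thing I have in mind, assuming only one level of sub-directories under '/some/dir/' and keys of the form "id/name.jpg". I don't know whether this is the right (or fastest) way to do it, which is really what I'm asking:

// Sketch only: one scandir() for the parent directory, plus one per sub-directory,
// building a set of "id/filename" keys. Assumes '/some/dir/' contains only the
// id sub-directories from the example above.
$found = array();
foreach (scandir('/some/dir/') as $dir) {
    if ($dir === '.' || $dir === '..') {
        continue;
    }
    foreach (scandir('/some/dir/' . $dir) as $file) {
        $found[$dir . '/' . $file] = true; // e.g. "123/photo.jpg"
    }
}

foreach ($entries as $entry) { // let's say this loops 1000 times
    if (isset($found[$entry['id'] . '/' . $entry['name'] . '.jpg'])) {
        echo 'file exists';
    }
}

This still touches the disk once per sub-directory rather than once in total, so I'm not sure it gains much over calling file_exists() directly.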