
I have a folder with a huge number of pictures (at least 10,000 files) and I need to get the names of all of these files using PHP. The problem is that when I use scandir() I get an error about the memory limit. I also tried code like this:

          $files = [];
          $dir = opendir($this->path);
          $i = 0;
          while(($file = readdir($dir)) !== false) {
              $files[] = $file;
              $i++;
              if ($i == 100)
                  break;
          }

This code works fine, but it's not what I need. When I try to get all of the files, the script still crashes.

I also thought about somehow saving the state of the pointer in $dir so I could reuse it across AJAX requests and fetch all the files in chunks, but I can't find any way to do that.

Is there any way to set a limit and an offset when reading files? I mean something like pagination.
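
Something like this is what I'm imagining (just a sketch; I'm not certain LimitIterator can be combined with DirectoryIterator this way, and $offset/$limit would come from the AJAX request):

          // Hypothetical pagination over a directory: LimitIterator skips
          // $offset entries and stops after $limit entries, so only one
          // "page" of file names is held in memory at a time.
          $offset = 0;   // e.g. taken from the AJAX request
          $limit  = 100;
          $files  = [];
          $window = new LimitIterator(new DirectoryIterator($this->path), $offset, $limit);
          foreach ($window as $file) {
              if (!$file->isDot()) {
                  $files[] = $file->getFilename();
              }
          }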

  • What, *exactly*, is your question? – Jay Blanchard Sep 29 '17 at 12:53
  • look into http://php.net/manual/en/class.directoryiterator.php perhaps – Scuzzy Sep 29 '17 at 12:54
  • No real question here, however running through 10000 files is going to cost you some memory. Either use caching and go through the files in small batches with an interval in between them, or get a better server. – Classified Sep 29 '17 at 12:54
  • Or both @Classified – GrumpyCrouton Sep 29 '17 at 13:00
  • or save them in a database (temp or static) – user2659982 Sep 29 '17 at 13:00
  • How about increasing the memory limit? `ini_set('memory_limit','16M');` – Lukas Sep 29 '17 at 13:00
  • [Change](https://stackoverflow.com/questions/11885191/how-to-increase-memory-limit-for-php-over-2gb) your memory limit either in the `php.ini` or with `ini_set` – DarkBee Sep 29 '17 at 13:01
  • Changing the memory limit will just delay the issue for another time where the data structure is larger. I would not recommend increasing the memory limit to extreme heights. Depending on how you need to use this data there are several solutions. If you need the data real time, then you need a better server and then increasing the memory limit would be acceptable. However if you just need to show this in a structured manner on your website then pagination is the way to go, or some sort of caching of the data and a background process that runs through the files. – Classified Sep 29 '17 at 13:04

2 Answers


You can use RecursiveDirectoryIterator with a Generator if memory is a huge issue.

function recursiveDirectoryIterator($path) {
  foreach (new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path)) as $file) {
    if (!$file->isDir()) {
      // getFilename() already includes the extension
      yield $file->getFilename();
    }
  }
}

$start = microtime(true);
$instance = recursiveDirectoryIterator('../vendor');
$total_files = 0;
foreach($instance as $value) {
  // echo $value
  $total_files++;
}
echo "Mem peak usage: " . (memory_get_peak_usage(true)/1024/1024)." MiB";
echo "Total number of files: " . $total_files;
echo "Completed in: ", microtime(true) - $start, " seconds";

Here's what I got on my not-so-great laptop.

(screenshot of the script's output: peak memory usage, total file count, and completion time)

Andrei

I have a Unix background, so (assuming you are running your PHP on Linux or Unix) you could:

  • make a system call to: /bin/ls -c1 > files.list
  • you can make the system call as complicated as you want, to sort, parse, edit, ...
  • read files.list and display that
  • you could use file_get_contents() to read the file (a rough sketch follows below)
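
A minimal sketch of that idea (the directory and list-file paths are just examples, and I used file() rather than file_get_contents() so the lines come back as an array directly):

$dir  = '/path/to/pictures';   // example directory
$list = '/tmp/files.list';     // example output file

// Write one file name per line; tweak the ls options to sort/filter as needed.
exec('/bin/ls -c1 ' . escapeshellarg($dir) . ' > ' . escapeshellarg($list));

// Read the list back and show only one "page" of names at a time.
$offset = 0;
$limit  = 100;
$names  = file($list, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach (array_slice($names, $offset, $limit) as $name) {
    echo $name, PHP_EOL;
}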
Nic3500
  • 8,144
  • 10
  • 29
  • 40