I'm using PHP's ZipArchive class to create a zip file of a directory. The code works fine for almost all directories except two, which happen to be the largest ones (the largest currently contains 51 files totalling 175 MB). When I run the code for these directories, a temporary file ('filename.zip.[RandomLettersAndNumbers]', e.g. 'filename.zip.riaab4') is created with a size of 67,108,864 bytes and the script throws an internal server error (500).

What I've tried so far (most of it is still visible in the source below):

  • Increased memory_limit
  • Increased max_execution_time
  • Checked the PHP error log: it is empty
  • Tried to find the exact line where the error occurs: the code runs up to $zip_archive->close();

Source:

//Temporary (debugging) START
ini_set("log_errors", 1);
ini_set("error_log", "php-error.log");
error_log("Hello errors.");
ini_set('memory_limit','512M');
set_time_limit(180);
//phpinfo();
//Temporary (debugging) END

//Get selected directory
// --> Code removed
$result['directory'] = 'directory1';    

//Create .zip file
$zip_archive = new ZipArchive();
$ZIP = 'files/'.$result['directory'].'.zip';

if($zip_archive->open($ZIP, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== TRUE) {
    echo 'Error while creating ZIP archive.';
    die;
}

foreach(new DirectoryIterator('files/'.$result['directory']) as $fileInfo) {
    if($fileInfo->isDot())
        continue;
    if($fileInfo->isFile()) //isFile() checks the entry itself; getPath() only returns the parent directory
        $zip_archive->addFile($fileInfo->getPathname(), $fileInfo->getFilename());
}

//Temporary (debugging) START
echo '<br><br><br>';
$num_files = $zip_archive->numFiles; //Read before close(); the property is reset afterwards
var_dump($closed = $zip_archive->close()); //Store the result; calling close() a second time would fail
// ## The following lines are not executed!! ##
echo '<br><br><br>';
var_dump($zip_archive->getStatusString());
echo '<br><br><br>';
//Temporary (debugging) END

if($num_files > 0 && $closed === true)
    $FILE = $ZIP;
else {
    //Error handling --> Code removed
}

Am I missing something? If there's anything else I can try please let me know. Thanks.

EDIT: I've got a more specific question: Why do the temporary zip files of the affected directories all have a size of exactly 67,108,864 bytes (64 MiB)? Is this a maximum file size set by PHP or the server, or can it be explained by the ZIP standard/ZipArchive class?
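For what it's worth, 67,108,864 bytes is exactly 64 MiB, which looks more like an external cap (for example a per-process file-size limit on the host) than anything in the ZIP format, which handles far larger archives. If the host permits `shell_exec()`, a quick sketch to inspect the inherited limit from PHP (assuming a bash/sh environment; this is only a diagnostic idea, not a confirmed cause):

```php
<?php
// Sketch: query the per-process file-size limit (RLIMIT_FSIZE) that the
// web server process inherited. "unlimited" means no cap; otherwise the
// shell reports the limit in blocks (1024-byte units in bash).
$limit = trim((string) shell_exec('ulimit -f'));
echo 'File size limit: ', ($limit !== '' ? $limit : 'shell_exec() not available'), "\n";

// For comparison: 64 MiB expressed in bytes matches the observed temp file size.
echo 'Observed temp file size: ', 64 * 1024 * 1024, " bytes\n"; // 67,108,864
```

A reported limit of 65536 blocks would correspond exactly to the 67,108,864-byte ceiling observed; if that's the case, only the hosting provider can raise it.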

MatthewS
  • Apache or Nginx? I think PHP works, but the server doesn't wait for PHP to finish, so you see error 500 – KoIIIeY Jan 11 '16 at 17:19
  • So check the server error log, not PHP's – KoIIIeY Jan 11 '16 at 17:21
  • I'm using a web hosting service, therefore I don't have access to the actual web server. The server error log is empty (except some unrelated "Premature end of script headers" errors). According to customer service, there is no maximum file size (it's possible to zip the directory on my local computer and upload the zip file to the server) and no maximum execution time. – MatthewS Jan 11 '16 at 17:59
  • This may be due to a limitation of the number of file handles. Can you log `$zip_archive->numFiles`? Try to zip 100+ small files and see if error 500 gets thrown. A very good description of this problem can be found as a comment in the PHP manual http://php.net/manual/en/ziparchive.addfile.php#112044 – maxhb Jan 11 '16 at 20:05
  • Interesting idea (thanks!), but sadly it does not solve the issue. The largest directory contains 51 files. I've changed `$zip_archive->addFile(...)` to `$zip_archive->addFromString(...)` which works fine (resulting in a zip file of 51x 1KB files), so the number of files should not be an issue. After implementing the suggested code and closing the ZipArchive every 5 files I still get an internal server error as soon as the file size of the temporary zip file reaches 67,108,864 bytes. – MatthewS Jan 11 '16 at 20:47
  • Try creating the archive not in code but via the console: `exec('zip command here');` – KoIIIeY Jan 12 '16 at 09:08
  • exec creates a zip file of 67,108,864 bytes (64 MB) which is probably damaged. Is it possible to determine if the maximum file size is set by php or web server settings? `memory_limit` is set to 512M (local value) / 128M (master value). – MatthewS Jan 12 '16 at 16:49
  • try this - http://stackoverflow.com/questions/17818080/fatal-error-allowed-memory-size-of-67108864-bytes-exhausted-tried-to-allocate – KoIIIeY Jan 12 '16 at 18:32

1 Answer

Try closing the ZIP archive and reopening it after every 5 or so files are added. I assume this works around a memory limit, since ZipArchive only writes its buffered data to disk on close().
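A minimal sketch of that batching approach (the function name `zipDirectoryInBatches` and the `$batchSize` parameter are made up for illustration; note the reopen must not use `ZipArchive::OVERWRITE`, or each batch would truncate the archive):

```php
<?php
// Sketch: add a directory's files to a zip, closing and reopening the
// archive every $batchSize files so queued data is flushed to disk.
function zipDirectoryInBatches(string $sourceDir, string $zipPath, int $batchSize = 5): bool
{
    $zip = new ZipArchive();

    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;
    }

    $added = 0;
    foreach (new DirectoryIterator($sourceDir) as $fileInfo) {
        if (!$fileInfo->isFile()) {
            continue; // skips dot entries and subdirectories
        }
        $zip->addFile($fileInfo->getPathname(), $fileInfo->getFilename());
        if (++$added % $batchSize === 0) {
            // Flush what has been queued so far, then reopen to continue.
            if ($zip->close() !== true) {
                return false;
            }
            // CREATE only: OVERWRITE here would discard the previous batches.
            if ($zip->open($zipPath, ZipArchive::CREATE) !== true) {
                return false;
            }
        }
    }
    return $zip->close();
}
```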

Credit to MatthewS for this solution.

Seth