
How can I find out what the issue is when I try to create a zip file from a 2GB file?

Error

file_get_contents(): content truncated from 2147483648 to 2147483647 bytes

Fatal error: Out of memory (allocated 2151677952) (tried to allocate 18446744071562067968 bytes) in

I am using a dedicated server and have already set memory_limit, max_execution_time, max_upload_filesize and max_post_size, but it is not working for me. Please check my code and let me know what I am doing wrong:

Create a new zip object:

    $zip = new ZipArchive();

    # create a temp file & open it
    $tmp_file = tempnam('.','');
    $zip->open($tmp_file, ZipArchive::CREATE);

    # loop through each file
    foreach($files as $file){
        # download file
        $download_file = file_get_contents($file_path.'/'.$file);
        #add it to the zip
        $zip->addFromString(basename($file_path.'/'.$file),$download_file);
    }

    # close zip
    $zip->close();
    $zip_name = $last_seg.'.zip';
    # send the file to the browser as a download
    header("Content-disposition: attachment; filename=$zip_name");
    header('Content-type: application/zip');
    readfile($tmp_file);
  • This is a memory limit issue and your file is too large; try the solution provided here: http://stackoverflow.com/questions/6282887/php-rendering-large-zip-file-memory-limit-reached –  Apr 06 '15 at 09:23
  • Also check this question/answer http://stackoverflow.com/questions/5745255/php-aborting-when-creating-large-zip-file –  Apr 06 '15 at 09:24
  • possible duplicate of [Upper memory limit for PHP/Apache](http://stackoverflow.com/questions/4399138/upper-memory-limit-for-php-apache) – Zulu Apr 06 '15 at 10:17
  • Thank you so much. It could be a duplicate question, but I asked it again after trying all the solutions available there. I have tried to increase the memory limit from WHM, but it is not working :( –  Apr 06 '15 at 13:16
  • I am using a dedicated server and have already set memory_limit, max_execution_time, max_upload_filesize and max_post_size, but it is not working for me. –  Apr 08 '15 at 11:34
  • 1
    the message says that's you're trying to allocate 18446744 TeraBytes. I think you should not only adjust memory limits but also buy a lot more RAM... – Paolo Apr 08 '15 at 12:02
  • Like others said before, you should use streams to create the zip files. Another option is to use an external binary to compress the actual files instead of PHP's zip. Anyway, be careful creating archives for files bigger than 2GB; some zip implementations might have problems decompressing them. – Gerd K Apr 08 '15 at 14:00

3 Answers


I changed $zip->addFromString() to $zip->addFile() because you don't need to read the file contents in order to add the file. I tested your code with 3 films and it didn't work (I had the same error), but when I used $zip->addFile() everything went fine and I could download a 3GB zip file.

I needed to use set_time_limit(0);

If you want to test this code, just change the values of:

    $files     // Array of file names
    $file_path // Path where your files ($files) are located
    $last_seg  // The name of your zip file

    <?php

    set_time_limit(0);

    $files = array('Exodus.mp4', 'the-expert.webm', 'what-virgin-means.webm');
    $file_path = 'zip';
    $last_seg = 'test';

    $zip = new ZipArchive();

    # create a temp file & open it
    $tmp_file = tempnam('.','');
    $zip->open($tmp_file, ZipArchive::CREATE);

    # loop through each file
    foreach($files as $file){
        $zip->addFile($file_path.'/'.$file, $file);
    }

    # close zip
    $zip->close();
    $zip_name = $last_seg.'.zip';
    # send the file to the browser as a download
    header("Content-disposition: attachment; filename=$zip_name");
    header('Content-type: application/zip');
    readfile($tmp_file);

    ?>

You can read more at:

http://php.net/manual/en/ziparchive.addfile.php
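
As a small, untested variation on the code above (same $files, $file_path and $last_seg assumptions), you can also check the return values of open() and addFile(), send a Content-Length header so the browser knows the download size, and remove the temp file afterwards:

    <?php
    set_time_limit(0);

    // Same example values as in the code above
    $files     = array('Exodus.mp4', 'the-expert.webm', 'what-virgin-means.webm');
    $file_path = 'zip';
    $last_seg  = 'test';

    $zip      = new ZipArchive();
    $tmp_file = tempnam(sys_get_temp_dir(), 'zip');

    // ZipArchive::open() returns TRUE on success or an error code otherwise
    if ($zip->open($tmp_file, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        die('Could not create the temporary zip file');
    }

    foreach ($files as $file) {
        // addFile() only records the path; the data is read and compressed
        // when close() is called, so the file never ends up in a PHP string
        if (!$zip->addFile($file_path.'/'.$file, $file)) {
            die('Could not add '.$file.' to the archive');
        }
    }

    $zip->close();

    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="'.$last_seg.'.zip"');
    header('Content-Length: '.filesize($tmp_file));

    readfile($tmp_file);
    unlink($tmp_file); // remove the temp file once it has been sent
    ?>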

Adrian Cid Almaguer

You'll never be able to allocate more memory than PHP_INT_MAX allows. The Linux x64 builds of PHP might handle this if file_get_contents() isn't internally limited to a signed 32-bit int, but on Windows or on a 32-bit system you have no chance of achieving this without streaming.

Something like this might work (not tested yet):

    $fr = fopen("http://...", "r");
    $fw = fopen("zip://c:\\test.zip#test", "w");

    // copy in small chunks so the whole file never sits in memory
    while (!feof($fr)) {
        $buffer = fread($fr, 8192);
        if ($buffer === false) {
            break;
        }
        fwrite($fw, $buffer);
    }

    fclose($fr);
    fclose($fw);

OK, my bad: apparently PHP does not provide a write mode for the zip:// stream wrapper... Your remaining options are then to write the whole file to a temp file (by streaming it as I did, not with file_get_contents) before handing it to an external program (with a system() or popen() call...), to use another compression format (apparently PHP supports write stream operations for zlib and bzip2), or to use an external library for PHP.
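
To illustrate the zlib option, here is a minimal, untested sketch that stream-copies a large file into a .gz archive through PHP's compress.zlib:// wrapper (which, unlike zip://, can be opened for writing); both paths are placeholders:

    <?php
    // Untested sketch: compress a large file to gzip without ever holding
    // the whole file in memory. Both paths below are placeholders.
    $src = fopen('/path/to/large-file.mp4', 'rb');
    $dst = fopen('compress.zlib:///tmp/large-file.mp4.gz', 'wb');

    if ($src === false || $dst === false) {
        die('Unable to open source or destination stream');
    }

    while (!feof($src)) {
        $chunk = fread($src, 8192); // small buffer keeps memory usage flat
        if ($chunk === false) {
            break;
        }
        fwrite($dst, $chunk);
    }

    fclose($src);
    fclose($dst);
    ?>

The same read/write loop works for the external-binary route as well: stream the source into a plain temp file first, then pass that file to a system zip command with system() or popen().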

mathieu
  • Thank you mathieu, so I need to change my code? Do you know what change I should make here, or any link where I can find a reference? –  Apr 08 '15 at 11:49
  • Basically, you need to open a zip stream to write into, and then read the input file repeatedly with a buffer size PHP can handle... I'm not sure it's possible without external libraries. – mathieu Apr 08 '15 at 11:57

Try putting this line at the beginning of your code:

    ini_set("memory_limit", -1);

Refer to this question: Fatal error: Out of memory (allocated 1134559232) (tried to allocate 32768 bytes) in X:\wamp\www\xxx

Manee.O.H
  • OK, thank you so much for giving your precious time, I'll try this. I've already tried memory_limit = -1 in WHM, but it didn't work. –  Apr 06 '15 at 13:12
  • -1 An allocation of 18446744071562067968 bytes will exceed the system memory. Adding more memory just to be able to read a file into memory is always a bad idea. – Christian Kuetbach Apr 08 '15 at 12:10