4

100 MB file --> 10 ZIP calls (10 MB zipped per call) --> 1 ZIP file

I should initiate 10 calls to add a single 100 MB file into a ZIP file (say, 10 MB of zipping per call).

The problem is that we have a system with memory and time limits (it will not process more than 10 to 15 MB per call).

So zipping a big file across many calls is the basic idea.

I am ready to provide more details if required.

itsoft3g

2 Answers

3

Have you ever tried PECL Zip before?

I just zipped two files with the following code without any memory limit problems; the time limit can be reset with set_time_limit(). My environment: memory_limit of 3 MB and max_execution_time of 20 seconds.

<?php
set_time_limit(0); // let the script run as long as it needs

$zip = new ZipArchive();
if ($zip->open('./test.zip', ZipArchive::CREATE) !== true) {
    die('could not open ./test.zip');
}
$zip->addFile('./testa'); // 1.3 GB
$zip->addFile('./testb'); // 700 MB
$zip->close();

Note: set_time_limit() will not work on PHP < 5.4 with safe_mode=On.


Another approach is to create the zip in a background process, which avoids possible memory_limit issues.

Here is an example: http://pastebin.com/7jBaPedb

Usage:

try {
  $t = new zip('/bin', '/tmp/test.zip');
  $t->zip();

  if ($t->waitForFinish(100))
    echo "Success :)";
  else
    echo $t->getOutput();
} catch (Exception $e) {
  echo $e->getMessage();
}

Instead of waiting until the process has ended, you could write the pid to a database and serve the file once the process has finished...
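That pid idea can be sketched as follows. This is my own hedged sketch, not code from the linked pastebin: a temp file stands in for the database, the `zip` CLI is assumed to be installed, and the `/proc` liveness check is Linux-only.

```php
<?php
// Sketch only: launch the zip as a background process, remember its pid
// (a temp file stands in for the database), and let a later request
// check whether it is still running. Linux-only; zip CLI assumed.
$cmd = 'zip -r /tmp/test.zip /bin >/dev/null 2>&1 & echo $!';
$pid = (int) shell_exec($cmd);             // pid of the background job
file_put_contents('/tmp/zip.pid', (string) $pid);

// ...then, in a later request:
$pid   = (int) file_get_contents('/tmp/zip.pid');
$alive = $pid > 0 && file_exists("/proc/$pid");
echo $alive ? "still zipping\n" : "done, serve /tmp/test.zip\n";
```

On shared hosting where `shell_exec()` is disabled, this approach is off the table, which is worth checking first.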

jgb
  • I am looking for a solution that works even on any shared hosting. The 3 MB limit you applied might not have held, because I feel it's impossible to zip 1.3 GB with 3 MB of memory. Please note the web server (like Apache) will also have memory and time limits. Due to shared hosting limitations you cannot get the pid. – itsoft3g Sep 18 '13 at 13:22
  • @itsoft3g Give the example above a try; it works well. I tested the code before posting it here. PECL Zip does exactly what you're asking for (it reads and writes files in chunks so memory limits are not hit). – jgb Sep 18 '13 at 13:46
  • But I have a time limitation, so I want to complete it across many calls. – itsoft3g Sep 18 '13 at 14:05
2

Reading your question, I first started to create a chunked zip packer to do just what you asked. It would generate an array of links to a web page, which you had to open in sequence to create a zip file. While the idea worked, I quickly realised it's not really needed.

A memory limit is only a problem when the packer tries to open the entire file at once and then zip it. Luckily, a few smart people have already figured out that it's easier to do it in chunks.

Asbjorn Grandt is one of those people; he created this zip class, which is very easy to use and does what you need.
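To illustrate the chunk-by-chunk idea such classes rely on, here is a minimal sketch using PHP 7's incremental `deflate_*` API. The file names and the 1 MB chunk size are just for illustration, and a real zip file would wrap this raw deflate stream in proper zip headers:

```php
<?php
// Illustrative only: compress a file piece by piece so that at most
// ~1 MB of it sits in memory at a time. A real zip entry wraps this
// deflate stream in local file headers and a central directory.
file_put_contents('sample.txt', str_repeat('a', 4 * 1024 * 1024)); // demo input

$in  = fopen('sample.txt', 'rb');
$out = fopen('sample.txt.deflate', 'wb');
$ctx = deflate_init(ZLIB_ENCODING_RAW);

while (!feof($in)) {
    $chunk = fread($in, 1024 * 1024);                 // 1 MB per iteration
    if ($chunk === false || $chunk === '') {
        break;
    }
    fwrite($out, deflate_add($ctx, $chunk, ZLIB_NO_FLUSH));
}
fwrite($out, deflate_add($ctx, '', ZLIB_FINISH));     // flush remaining output

fclose($in);
fclose($out);
```

Because only one chunk is ever buffered, peak memory use stays flat no matter how large the input file is.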

First I created a very large file, 500 MB in size, filled with various letters. This file is way too big to handle at once, which results in fatal memory limit errors.

<?php
// Write 500 MB of random letters to largefile.txt, 1 MB at a time.
$fh = fopen("largefile.txt", 'wb');
$size = 500;
while ($size--) {
    $l = chr(rand(97, 122)); // a random lowercase letter
    fwrite($fh, str_repeat($l, 1024 * 1024));
}
fclose($fh);
?>

And to use the zip class we would do:

<?php
include('zip.php');

$zip = new Zip();
$zip->setZipFile("largefile.zip");
//The first argument is the name as it will appear inside the zip and the second is the file location. In my case I used the same for both, but you could easily rename the file inside the zip.
$zip->addLargeFile("largefile.txt", "largefile.txt");
$zip->finalize();
?>

Now the large zip is created in just a few seconds on my server, and the result is a 550 KB file.

Now if, for some weird reason, you still need to do this in several web requests, let me know. I still have the code I started with to do just that.
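For reference, a minimal sketch of that multi-request idea (my own hedged version, not the code mentioned above): each call compresses the next slice of the source as its own gzip member and appends it, since a file made of concatenated gzip members is still valid gzip. The file names, the state file, and the 10 MB chunk size are placeholders.

```php
<?php
// Sketch only: process one ~10 MB slice per web request, persisting the
// current offset in a state file between requests. Each slice becomes
// its own gzip member appended to the output (.gz, not .zip).
const CHUNK = 10 * 1024 * 1024;

function zip_next_chunk(string $src, string $dst, string $state): bool {
    $raw    = @file_get_contents($state);   // offset survives between requests
    $offset = $raw === false ? 0 : (int) $raw;
    $size   = filesize($src);
    if ($offset >= $size) {
        return true;                        // nothing left to do
    }

    $in = fopen($src, 'rb');
    fseek($in, $offset);
    $data = fread($in, CHUNK);
    fclose($in);

    file_put_contents($dst, gzencode($data), FILE_APPEND);
    file_put_contents($state, (string) ($offset + strlen($data)));

    return $offset + strlen($data) >= $size;  // true once finished
}
```

Each web request would call `zip_next_chunk()` once; when it returns `true`, the archive is complete and can be served. Note that already-compressed media such as a 500 MB movie will barely shrink with any of these approaches, so each chunk still costs roughly its full size in I/O.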

Hugo Delsing
  • Sorry for the huge delay. You compressed a text file of 500 MB; that's why you got it down to around 550 KB (compressed), and why it took so little time. I want it in several web requests, and please also try some media file of 500 MB (a movie, maybe). – itsoft3g Sep 18 '13 at 13:13