Reading your question, I first started to build a chunked zip packer to do exactly what you asked. It would generate an array of links to a webpage, which you had to open in sequence to create the zip file. While the idea worked, I quickly realised it's not really needed.
A memory limit is only a problem when the packer tries to open the entire file at once and then zip it. Luckily, a few smart people have already figured out that it's easier to do it in chunks.
Asbjorn Grandt is one of those people: he created a zip class which is very easy to use and does exactly what you need.
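To make the chunking idea concrete before we use the class, here is a minimal sketch of my own (not Grandt's code) that streams a file through zlib one 1MB chunk at a time, so peak memory stays near the chunk size no matter how big the input is. Note it produces a .gz stream rather than a proper .zip archive, which is why the class below is still the better tool:
<?php
// Read the source in 1MB chunks and compress each chunk as it is read,
// instead of loading the whole file into memory first.
$in = fopen("largefile.txt", 'rb');
$gz = gzopen("largefile.txt.gz", 'wb9'); // 9 = maximum compression
while (!feof($in)) {
    gzwrite($gz, fread($in, 1024 * 1024)); // compress one 1MB chunk
}
gzclose($gz);
fclose($in);
?>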
First I created a very large file: 500MB in size, filled with random letters. This file is far too big to handle at once, which is what causes the fatal memory limit errors.
<?php
// Write 500 chunks of 1MB each, giving a 500MB file of random letters.
$fh = fopen("largefile.txt", 'wb');
$size = 500;
while ($size--) {
    $l = chr(rand(97, 122));                  // a random lowercase letter
    fwrite($fh, str_repeat($l, 1024 * 1024)); // one 1MB chunk
}
fclose($fh);
?>
And to use the zip class we would do:
<?php
include('zip.php');
$zip = new Zip();
$zip->setZipFile("largefile.zip");
// The first name is how the file will appear inside the zip and the second
// is the file location on disk. Here I used the same name for both, but you
// could easily rename the file inside the zip.
$zip->addLargeFile("largefile.txt", "largefile.txt");
$zip->finalize();
?>
The large zip was created in just a few seconds on my server, and because the file consists of long runs of repeated letters, the 500MB of input compressed down to a 550KB zip.
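If you want to verify the memory claim on your own server, you can print the peak usage right after finalize(); this is plain PHP and should come out far below the 500MB of the source file:
<?php
// Peak memory after zipping; with the chunked approach this stays
// near the chunk size rather than the full file size.
printf("%.1f MB peak\n", memory_get_peak_usage(true) / 1048576);
?>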
Now, if for some weird reason you still need to do this across several web requests, let me know. I still have the code I started with to do just that.