In my experience, the best solution is to use aws-sdk-php to access the objects on S3 through an S3Client with registerStreamWrapper() enabled. You can then fopen() each object as an s3:// stream and feed that stream directly to ZipStream's addFileFromStream() method, letting ZipStream take it from there. No ZipArchive, no massive memory overhead, no building the zip on the server, and no copying files from S3 onto the web server just to stream a zip from them.
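For reference, the client setup that the "//..." in the snippet below stands in for looks roughly like this. This is only a minimal sketch, assuming the AWS SDK for PHP is installed via Composer; the region is a placeholder and credentials are resolved from the environment (or an IAM role) by default:

require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Placeholder region - adjust for your environment. Credentials can also be
// passed explicitly via the 'credentials' option if they are not available
// from the environment or an instance role.
$s3Client = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-1',
]);

The snippet below then calls registerStreamWrapper() on that client.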
So:
//...
$s3Client->registerStreamWrapper(); // required so that s3:// paths can be opened with fopen()

// Test files on S3.
$s3keys = array(
    "ziptestfolder/file1.txt",
    "ziptestfolder/file2.txt"
);

// Define suitable options for the ZipStream archive.
$opt = array(
    'comment' => 'test zip file.',
    'content_type' => 'application/octet-stream'
);

// Initialise ZipStream with the output zip filename and options.
$zip = new ZipStream\ZipStream('test.zip', $opt);

// Loop over the keys - useful for multiple files.
$bucket = 'bucketname';
foreach ($s3keys as $key) {
    // Use the file name from the S3 key so the entry in the zip
    // keeps the same name.
    $fileName = basename($key);

    // Build the s3:// path the stream wrapper understands.
    $s3path = "s3://" . $bucket . "/" . $key;

    // Open a read stream on the object and hand it straight to ZipStream.
    if ($streamRead = fopen($s3path, 'r')) {
        $zip->addFileFromStream($fileName, $streamRead);
        fclose($streamRead);
    } else {
        die('Could not open stream for reading');
    }
}

$zip->finish();
And if you are using ZipStream in a Symfony controller action, see this answer too: https://stackoverflow.com/a/44706446/136151
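The gist of that approach is to build the archive inside the callback of Symfony's StreamedResponse, so the zip is streamed to the client as it is generated. The following is only a rough sketch of the shape, not a drop-in implementation, assuming $s3Client, $s3keys and the bucket name are prepared as above; see the linked answer for the full details:

use Symfony\Component\HttpFoundation\StreamedResponse;

// Sketch of a controller action body.
$s3Client->registerStreamWrapper();

$response = new StreamedResponse(function () use ($s3keys) {
    $zip = new ZipStream\ZipStream('test.zip');
    foreach ($s3keys as $key) {
        if ($streamRead = fopen("s3://bucketname/" . $key, 'r')) {
            $zip->addFileFromStream(basename($key), $streamRead);
            fclose($streamRead);
        }
    }
    $zip->finish();
});

$response->headers->set('Content-Type', 'application/octet-stream');
$response->headers->set('Content-Disposition', 'attachment; filename="test.zip"');

return $response;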