
I am new to Amazon S3. In my CakePHP 3.0 application I need to copy a file from one bucket to another. I installed the AWS SDK via Composer, and in my controller's initialize() method I set up the S3 client with credentials:

public function initialize() {
    parent::initialize();
    $this->s3Client = new S3Client([
        'version' => 'latest',
        'region' => 'mumbai',
        'credentials' => [
            'key' => 'key',
            'secret' => 'secret',
        ],
    ]);
}

And in my function to copy the file:

public function amazonCopy() {
    $this->s3Client->registerStreamWrapper();
    $sourceBucket = "collect";
    $sourceKeyname = "testFolder";

    $this->s3Client->copyObject(array(
        'Bucket'     => "test1",
        'Key'        => 'testFolder',
        'CopySource' => "{$sourceBucket}/{$sourceKeyname}",
    ));
}

Now I am getting the following error:

Error executing "ListBuckets" on "https://s3.mumbai.amazonaws.com/"; AWS HTTP error: Error creating resource: [message] fopen(): php_network_getaddresses: getaddrinfo failed: Name or service not known [file] /var/www/mm/Src/php/operations/vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php [line] 312 [message] fopen(https://s3.mumbai.amazonaws.com/): failed to open stream: php_network_getaddresses: getaddrinfo failed: Name or service not known [file] /var/www/mm/Src/php/operations/vendor/guzzlehttp/guzzle/src/Handler/StreamHandler.php [line] 312 Aws\S3\Exception\S3Exception

I want to create a zip file from the data in one bucket and copy it to another bucket, using streams, without writing the file to disk. What could be the cause of this error? Any help will be appreciated.
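(Editor's note on the error itself: the hostname in the message, s3.mumbai.amazonaws.com, suggests the 'region' value is not a valid AWS region identifier, so DNS resolution fails. The Mumbai region's identifier is ap-south-1; a client configured as in the sketch below, with placeholder credentials, would produce a resolvable endpoint.)

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// 'ap-south-1' is the AWS region identifier for Mumbai; a plain city
// name like 'mumbai' yields an endpoint that cannot be resolved.
$s3Client = new S3Client([
    'version' => 'latest',
    'region'  => 'ap-south-1',
    'credentials' => [
        'key'    => 'key',     // placeholder
        'secret' => 'secret',  // placeholder
    ],
]);
```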

Zipping as a stream now works properly when I use the ZipStream library; for that I used this code:

$this->s3Client->registerStreamWrapper();
$zip = new ZipStream\ZipStream('example.zip');

$zip->addFile('hello.txt', 'This is the contents of hello.txt');     

$zip->addFileFromPath('happy_children-wide.jpg', 's3://mmvideo-test/Music/happy_children-wide.jpg');
$zip->addFileFromPath('txte', 's3://test/Music/test1.txt');
$zip->finish();

But the problem remains: I want to write the zip output on the fly to bucket2. My files will contain large videos, so it is not possible to store the archive on disk as temporary storage; it should be written to the bucket as a zip on the fly.

Senchu Thomas

1 Answer


I tried but could not get this working with streams only. The code below uses streams to download the objects and zips them into a temporary file:

$client = new S3Client(array(
    'region'      => 'ap-northeast-1',
    'version'     => 'latest',
    'credentials' => $provider
));

$client->registerStreamWrapper();

$objects = $client->listObjectsV2([
    'Bucket' => $fromBucket,
    'Prefix' => $fromPrefix
]);
$keys = array();
foreach ($objects['Contents'] as $o) {
    $keys[] = $o['Key'];
}

$tmp = tempnam(sys_get_temp_dir(), 'zipstream');
echo "creating temp zip: $tmp\n";
$ostream = fopen($tmp, 'w');
$zip = new ZipStream(null, array(ZipStream::OPTION_OUTPUT_STREAM => $ostream));

foreach ($keys as $k) {
    echo "$k\n";
    $istream = fopen("s3://$fromBucket/$k", 'r');
    $zip->addFileFromStream($k, $istream);
    fflush($ostream);
    fclose($istream);
}

$zip->finish();
fclose($ostream); // flush all buffered zip data before uploading the file

$client->putObject(array(
    'Bucket'     => $toBucket,
    'Key'        => $toZip,
    'SourceFile' => $tmp
));
at0mzk
  • I want to create a zip file using data in one bucket and copy it to another bucket without writing the file to disk, using streams. – Senchu Thomas Jan 17 '17 at 14:24
  • Multiple files into one zip archive? Or just compress each file? Looking at http://php.net/manual/en/class.ziparchive.php it seems you can't do it with streams. But you can compress single files using streams with zlib: http://php.net/manual/en/ref.zlib.php – at0mzk Jan 18 '17 at 01:54
  • Hi at0mzk, I want to zip multiple files into one zip archive. It is possible with ZipStream, but I want it written to the S3 bucket that I specified, and it should happen on the fly. Right now it is possible to make the zip as a stream but not possible to write it to the S3 bucket. – Senchu Thomas Jan 18 '17 at 04:03
  • I think you will have to use a temp file for the zip anyway, on https://docs.aws.amazon.com/aws-sdk-php/v3/guide/service/s3-stream-wrapper.html#uploading-data see the Note: Because Amazon S3 requires a Content-Length header to be specified before the payload of a request is sent, the data to be uploaded in a PutObject operation is internally buffered using a PHP temp stream until the stream is flushed or closed. – at0mzk Jan 18 '17 at 06:05
  • tried it and getting CouldNotCreateChecksumException – at0mzk Jan 18 '17 at 06:18
  • It will be difficult for me to use a temp file because my zip will be almost 250 GB or more, so creating a temp file of that size will be difficult. It is also time consuming. – Senchu Thomas Jan 18 '17 at 07:11
  • It sounds like a use case for EFS: https://aws.amazon.com/efs/. But maybe you can do it with multipart upload. So instead of the tmp file on disk, you write to a php://temp buffer, read from it until you have 5 MB, then upload one part. – at0mzk Jan 18 '17 at 07:23
  • Is it possible to copy using multipart uploads? http://docs.aws.amazon.com/aws-sdk-php/v2/guide/service-s3.html – Senchu Thomas Jan 20 '17 at 07:04
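(Editor's note: the multipart approach suggested in the comments can be sketched with the SDK's low-level multipart calls: read the zip output from a stream in parts of at least 5 MB and upload each part, so the full archive never touches disk. This is an untested sketch; $source would be any readable stream carrying the ZipStream output, bucket and key are caller-supplied, and error handling/abort logic is omitted.)

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Sketch, untested: stream a large payload to S3 in 5 MB parts using
// the low-level multipart API, avoiding a full-size temp file on disk.
// $source is any readable stream (e.g. the read end of a pipe that
// ZipStream writes into).
function multipartUploadStream(S3Client $client, $source, $bucket, $key)
{
    $result = $client->createMultipartUpload([
        'Bucket' => $bucket,
        'Key'    => $key,
    ]);
    $uploadId = $result['UploadId'];

    $parts = array();
    $partNumber = 1;
    while (!feof($source)) {
        // 5 MB is S3's minimum size for every part except the last.
        $chunk = stream_get_contents($source, 5 * 1024 * 1024);
        if ($chunk === '' || $chunk === false) {
            break;
        }
        $part = $client->uploadPart([
            'Bucket'     => $bucket,
            'Key'        => $key,
            'UploadId'   => $uploadId,
            'PartNumber' => $partNumber,
            'Body'       => $chunk,
        ]);
        $parts[] = [
            'PartNumber' => $partNumber,
            'ETag'       => $part['ETag'],
        ];
        $partNumber++;
    }

    $client->completeMultipartUpload([
        'Bucket'          => $bucket,
        'Key'             => $key,
        'UploadId'        => $uploadId,
        'MultipartUpload' => ['Parts' => $parts],
    ]);
}
```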