
I am trying to download large files from AWS S3 with the getObject method, but for large files the page breaks. How can I use a Range to download the file completely, in parts?

function DownloadContent($keyName) {
    $store = array();
    require(__DIR__ . '/../config/s3Config.php');

    if (!$this->s3Client) {
        $this->s3Client = S3Client::factory(array(
                    'key' => $store['s3']['key'],
                    'secret' => $store['s3']['secret']
        ));
    }

    foreach ($keyName as $key => $row) {
        $varFileName = explode('/', $row);
        $bucket = 'my-bucket-name';
        $result = $this->s3Client->getObject(array(
            'Bucket' => $bucket,
            'Key' => $row
        ));

        header("Content-Type: {$result['ContentType']}");
        header("Content-Disposition: attachment; filename=\"{$varFileName[2]}\"");
        header('Expires: 0');
        header("X-Sendfile: {$varFileName[2]}");
        echo $result['Body'];
    }
}
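
What I have in mind is something along these lines (untested sketch; the 100 MB chunk size and the way I'm passing the Range parameter to getObject are my own assumptions), replacing the single getObject call inside the loop above:

// untested sketch: fetch the object in chunks via the Range parameter
$head = $this->s3Client->headObject(array(
    'Bucket' => $bucket,
    'Key'    => $row
));
$size = $head['ContentLength'];     // total object size in bytes

$chunkSize = 100 * 1024 * 1024;     // 100 MB per request (arbitrary)

for ($start = 0; $start < $size; $start += $chunkSize) {
    $end  = min($start + $chunkSize, $size) - 1;
    $part = $this->s3Client->getObject(array(
        'Bucket' => $bucket,
        'Key'    => $row,
        'Range'  => "bytes={$start}-{$end}"   // the end byte is inclusive
    ));

    // the headers from the original code would be sent once, before the first chunk
    echo $part['Body'];
    flush();
}
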
  • What actual error are you getting? Are you running out of memory in PHP? The files have to be loaded into PHP, taking up memory, so you need more than enough memory available to cope with them. – samlev Dec 03 '15 at 17:13
  • The page is getting the error ERR_INVALID_RESPONSE; I think the response for large files is not getting through. I want to use the Range attribute for concurrent downloads. – Andy Carroll Dec 03 '15 at 17:17
  • Is there anything in your PHP error logs? – samlev Dec 03 '15 at 17:18
  • When you talk about "large" you can't be completely vague like this. The last time I had a "large file problem" all the answers referenced sizes of 10 to 100 MB, whereas I was looking at 300 to 600 **G**B. "Large" is very relative. – Sammitch Dec 03 '15 at 18:22
  • My files are also several GB in size and need to be downloaded. – Andy Carroll Dec 04 '15 at 10:38
  • Did you find a solution for this issue? – Diego Ponciano Oct 13 '18 at 17:28

1 Answer


I know it's a rather old question, but there doesn't seem to be an answer to it.

$bucket_name = 'my-bucket';
$object_key = 'object-key';

// initialize the S3 client

$client = new S3Client(...);

// register the stream

$client->registerStreamWrapper();

// get the file size

$file_size = filesize('s3://'.$bucket_name.'/'.$object_key);

// get the meta data

$meta_data = $client->headObject([
    'Bucket' => $bucket_name,
    'Key' => $object_key
]);

// get the offset & length

$offset = 0;
$length = $file_size;
$partial_content = false;

if(isset($_SERVER['HTTP_RANGE'])) {
    // the first 500 bytes: bytes=0-499
    // the second 500 bytes: bytes=500-999
    // all bytes except for the first 500 until the end of document: bytes=500-
    // the last 500 bytes of the document: bytes=-500

    preg_match('{bytes=(\d+)?-(\d+)?(,)?}i', $_SERVER['HTTP_RANGE'], $matches);

    if(empty($matches[3])) {
        $partial_content = true;
        $offset = (!empty($matches[1])) ? intval($matches[1]) : 0;
        // the end position in a Range header is inclusive, hence the +1
        // (the suffix form "bytes=-500" is not handled here)
        $length = (!empty($matches[2])) ? intval($matches[2]) + 1 : $file_size;
        $length -= $offset;
    }
}

// set the headers for partial content

if($partial_content === true) {
    header('HTTP/1.1 206 Partial Content');
    header('Content-Range: bytes ' . $offset . '-' . ($offset + $length - 1) . '/' . $file_size);
}

// set the regular HTTP headers

header('Content-Type: '.$meta_data['ContentType']);
header('Content-Length: '.$length); // length of the body actually sent (whole file or just the requested range)
header('Content-Disposition: attachment; filename="'.basename($object_key).'"');
header('Accept-Ranges: bytes');

// open an S3 stream to the file

$s3file = fopen('s3://'.$bucket_name.'/'.$object_key, 'rb');

if(!$s3file) {
    throw new Exception('Error opening S3 stream for reading!');
}

// open an output stream

$out = fopen('php://output', 'wb');

// copy data from the S3 stream to the output stream

// note: depending on the SDK version, seeking an s3:// read stream may require
// opening it with the 'seekable' => true stream context option
fseek($s3file, $offset);
stream_copy_to_stream($s3file, $out, $length);

// close the streams

fclose($out);
fclose($s3file);

Just FYI: I've excluded the multirange options ...
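
As a side note (not part of the original code above): stream_copy_to_stream() already copies in internal chunks, but if output buffering is active the multi-GB response can still pile up in a buffer before it is sent. A rough variant of the final copy step that flushes after every chunk (the 8 KB chunk size is arbitrary) would be:

fseek($s3file, $offset);
$remaining = $length;

while ($remaining > 0 && !feof($s3file)) {
    // read at most 8 KB at a time and push it straight to the client
    $chunk = fread($s3file, min(8192, $remaining));
    if ($chunk === false) {
        break;
    }
    echo $chunk;
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
    $remaining -= strlen($chunk);
}

fclose($s3file);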

flowolf