
I have been able to upload files to my bucket using the code below from the AWS PHP SDK. The problem is that when I try to upload files larger than 15 MB, the script throws an error. Otherwise it works as expected. Any ideas what I'm doing wrong? Thanks in advance.

require $dir . 'aws/aws-autoloader.php';

$s3Client = new Aws\S3\S3Client(array(
    'version'     => 'latest',
    'region'      => 'us-west-2',
    'credentials' => array(
        'key'    => 'KEY',
        'secret' => 'SECRET',
    )
));
$result = $s3Client->putObject(array(
    'Bucket' => 'eot-resources',
    'Key' => $org_id."_".$_FILES["fileToUpload"]["name"],
    'SourceFile' => $_FILES["fileToUpload"]["tmp_name"],
    'Body'        => new GuzzleHttp\Psr7\Stream(fopen($_FILES["fileToUpload"]["tmp_name"], 'r')),
    'ACL' => 'public-read',
    'StorageClass' => 'REDUCED_REDUNDANCY',
    'Metadata' => array(
        'Foo' => 'abc',
        'Baz' => '123'
    )
));
echo "URL: ".$result['ObjectURL'] . "<br>";

I get the following error when trying to upload a file bigger than 15 MB; smaller files upload fine.

Warning: fopen(): Filename cannot be empty in /Users/xxx/Code/xxx/wp-content/plugins/xxx/parts/part-upload_file.php on line 37.

Line 37 reads..

'Body'        => new GuzzleHttp\Psr7\Stream(fopen($_FILES["fileToUpload"]["tmp_name"], 'r')),

Any help/advice/tips would be appreciated.
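For what it's worth, the "Filename cannot be empty" warning suggests that `tmp_name` is empty because PHP itself discarded the upload before the script ran. A minimal sketch that checks `$_FILES[...]['error']` before touching the file (the `describe_upload_error` and `check_upload` helpers are hypothetical, not part of the SDK):

```php
<?php
// Map PHP's built-in upload error codes to readable messages.
// UPLOAD_ERR_INI_SIZE fires when the file exceeds upload_max_filesize.
function describe_upload_error(int $code): string {
    $messages = [
        UPLOAD_ERR_OK        => 'No error',
        UPLOAD_ERR_INI_SIZE  => 'File exceeds upload_max_filesize in php.ini',
        UPLOAD_ERR_FORM_SIZE => 'File exceeds MAX_FILE_SIZE in the HTML form',
        UPLOAD_ERR_PARTIAL   => 'File was only partially uploaded',
        UPLOAD_ERR_NO_FILE   => 'No file was uploaded',
    ];
    return $messages[$code] ?? 'Unknown upload error';
}

// Returns null if the upload succeeded, or an error message if it did not.
function check_upload(array $files, string $field): ?string {
    $err = $files[$field]['error'] ?? UPLOAD_ERR_NO_FILE;
    return $err === UPLOAD_ERR_OK ? null : describe_upload_error($err);
}

// In the upload script, before calling putObject():
// if (($msg = check_upload($_FILES, 'fileToUpload')) !== null) {
//     die('Upload failed: ' . $msg);
// }
```

With a check like this, an oversized upload produces a clear message instead of the downstream `fopen()` warning.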

Tommy Adeniyi
  • I've also tried the php multipart uploader and the Javascript client side uploader and I get the same result. It either errors or times out when the files are over 10-15mb. Help!! – Tommy Adeniyi Mar 17 '17 at 20:37
  • Have a look this one: http://stackoverflow.com/questions/2184513/php-change-the-maximum-upload-file-size – sowi Mar 17 '17 at 20:48

1 Answer


I suspect your server limits the size of files that can be uploaded. Check using phpinfo() and look for post_max_size and upload_max_filesize. When a request exceeds these limits, PHP discards the upload before your script runs, which is why tmp_name is empty. Assuming that's the problem, you can increase the limits.
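To illustrate, a quick diagnostic that prints the two relevant settings (note that post_max_size caps the entire POST body, form fields included, so it must be at least as large as upload_max_filesize):

```php
<?php
// Print the two ini settings that cap upload size.
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL;
echo 'post_max_size: ' . ini_get('post_max_size') . PHP_EOL;
```

Both directives are PHP_INI_PERDIR, so they cannot be raised with ini_set() at runtime; set them in php.ini (or a .htaccess / .user.ini file, depending on your setup), for example:

    upload_max_filesize = 64M
    post_max_size = 64M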

Chris