
OK, I know this question has been asked before and all, but here's the thing:

  1. I'm already using `ini_set('memory_limit', '400M');`
  2. The file I'm trying to transfer (to Amazon S3) is 245MB.
  3. The error message is weird: it says the allowed memory of 400MB was exhausted when it was trying to allocate 239MB. Isn't that the other way round?

The script I'm using is a third-party library for communicating with Amazon S3.

Help please!

EDIT
OK, here's the code. As you can see, I'm not doing much; it's all about the script I'm using, which is here: http://belgo.org/backup_and_restore_to_amazo.html

ini_set('memory_limit', '400M');
require 'lib/s3backup.php';
$bucket = 'thebucketname';
$bucket_dir = 'apts';
$local_dir = "/home/apartmen/public_html/transfer/t/tr";
$s3_backup = new S3_Backup;
$s3_backup->upload_dir( $bucket, $bucket_dir, $local_dir );
Torrrd
  • You will need to show the script. – Pekka Oct 07 '10 at 12:51
  • Are you transferring one file or multiple files in a loop? – Gordon Oct 07 '10 at 12:55
  • Show your code. You shouldn't need 400+MB to transfer a 245MB file... You shouldn't need to read the file into memory (you should be able to just copy stream to stream)... So show your code so we can try to help you figure out why it's failing... – ircmaxell Oct 07 '10 at 12:56
  • I'm using the script here to back up a file to Amazon S3 -> http://belgo.org/backup_and_restore_to_amazo.html I successfully managed to back up my entire server, no problem, but then I try to transfer this one heavyweight file and it messes up – Torrrd Oct 07 '10 at 12:58
  • I think the problem is not in memory_limit; the problem is in your script. I ran into this error when my script was generating an ever-growing array, or was stuck in an infinite loop. – Alex Pliutau Oct 07 '10 at 12:58
  • It's a design flaw in those two classes (the backup one, and the S3 one). Rather than using streams to pass data around (so that you can simply do [`stream_copy_to_stream`](http://us.php.net/manual/en/function.stream-copy-to-stream.php) and never read the file into memory at all), they pass all the data around as strings. So using those classes (without significant refactoring) you're stuck using boatloads of memory (see the sketch just below these comments)... – ircmaxell Oct 07 '10 at 13:03
  • What @ircmaxell says. You need better upload classes – Pekka Oct 07 '10 at 13:08
  • A quick fix might be to insert `unset($data)` after the call to `$s3->putObject( $name, $data, NULL, NULL, NULL, $metadata );` in the S3Backup script. That would tell the GC to free the memory and will eventually lower the memory consumption during the `while` loop. See http://stackoverflow.com/questions/2617672/how-important-is-it-to-unset-variables-in-php/2617786#2617786 – Gordon Oct 07 '10 at 13:15
  • Thanks everyone, especially ircmaxell. Gordon, that didn't work, but that's probably because there's only one file. Maybe I'll just make it chop the file into smaller pieces. – Torrrd Oct 07 '10 at 19:33
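
Editor's note: a minimal sketch of the stream-to-stream idea ircmaxell describes above. The file paths are placeholders, and the destination here is just a local file to keep the example self-contained; the point is that the source is never read into memory as one big string.

&lt;?php
// Rough sketch of stream-to-stream copying; neither file is ever
// held in memory as a single string. Paths are placeholders.
$in  = fopen('/home/apartmen/public_html/transfer/t/tr/bigfile.dat', 'rb');
$out = fopen('/tmp/bigfile-copy.dat', 'wb');

if ($in === false || $out === false) {
    die('Could not open streams');
}

// PHP copies the data in small internal chunks, so memory use stays
// flat no matter how large the file is.
stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);

To get the same benefit when uploading to S3, the library itself would have to accept a stream or a file path rather than the full body as a string, which is exactly the refactoring ircmaxell is talking about.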

2 Answers


"allowed mem of 400MB exhausted when it was trying to allocate 239MB.." means that PHP was trying to allocate an additional 239MB of memory that (when added to the memory already allocated to the script) pushed it over the 400MB limit.

Mark Baker
  • OK, now I made it 700M but got an "Internal Server Error". Just checked; no, I didn't crash the server :P The file is just 245MB, why does it need all that memory? – Torrrd Oct 07 '10 at 12:54
  • I'd guess it needs that much memory because it's loading the entire file into memory rather than reading it in "chunks", or using a pull parser if it's an XML file – Mark Baker Oct 07 '10 at 12:57
  • @Tor: If you got an "Internal Server Error", you might want to check your logs to see why... – ircmaxell Oct 07 '10 at 13:05

The AWS SDK for PHP has an AmazonS3 class that can stream a local file up to S3.

http://docs.amazonwebservices.com/AWSSDKforPHP/latest/#m=AmazonS3/create_object

The parameter you need to pay attention to is `fileUpload`.
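
A rough sketch of how that might look, based on the documentation linked above; the bucket name, key, and file path are placeholders, and the SDK's credentials are assumed to be configured already:

&lt;?php
// Hypothetical usage sketch of create_object() from the AWS SDK for PHP.
// Bucket, key, and file path are placeholders.
require_once 'sdk.class.php';

$s3 = new AmazonS3();

$response = $s3->create_object('thebucketname', 'apts/bigfile.dat', array(
    // fileUpload points the SDK at a file on disk so it can stream it,
    // instead of needing the whole body as a string in memory.
    'fileUpload' => '/home/apartmen/public_html/transfer/t/tr/bigfile.dat',
));

if ($response->isOK()) {
    echo "Upload succeeded\n";
}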

Ryan Parman