10

So I want users to be able to upload big files without having to worry about the post_max_size limit. The alternative is using PUT and sending the file as raw data. With jQuery I can do this:

var data = new FormData();
jQuery.each($('#file_upload')[0].files, function(i, file) {
  data.append('file-' + i, file);
});
$.ajax({
  url: 'upload.php?filename=test.pdf',
  data: data,
  cache: false,
  contentType: false,
  processData: false,
  type: 'PUT',
});

In PHP I can do this:

$f = fopen($_GET['filename'], "w");
$s = fopen("php://input", "r");

while ($kb = fread($s, 1024)) {
    fwrite($f, $kb);
}
fclose($f);
fclose($s);
header("HTTP/1.1 201 Created");

I am not doing:

$client_data = file_get_contents("php://input");

since reading the whole file into a variable would exhaust the memory limit when uploading huge files.

The thing I cannot figure out is how to write the file data without the form boundaries. Right now it writes at the top of the file something like this:

------WebKitFormBoundaryVz0ZGHLGxBOCUVQG
Content-Disposition: form-data; name="file-0"; filename="somename.pdf"
Content-Type: application/pdf

and at the bottom something like this:

------WebKitFormBoundaryVz0ZGHLGxBOCUVQG--    

So I need to parse the data, but for that I would have to read the whole data stream into memory, and with large video files I don't want to do that. I read something about maybe creating a php://temp stream, but no luck with that yet. How can I write just the content to a file, without the boundary header, and without first pumping all the data into a variable?
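Roughly, what I had in mind for php://temp is something like this (an untested sketch; it at least keeps the upload out of a PHP variable, but the boundary parsing is still the missing piece):

// php://temp keeps up to 2 MB in memory and spills to a temporary file
// beyond that, so the upload never has to sit in a variable.
$s   = fopen("php://input", "r");
$tmp = fopen("php://temp", "r+");
stream_copy_to_stream($s, $tmp);   // copied in internal chunks, not all at once
fclose($s);
rewind($tmp);                      // $tmp can now be read back line by line
// ...the boundary/header parsing would have to go here...
fclose($tmp);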

user1494552
  • 163
  • 8

3 Answers

1

Maybe a combination of fgets, to stop reading at each newline, and a check for the boundaries:

while ($kb = fgets($s, 1024)) {
    if (strpos($kb, '------') === false) { // or !== 0 to only skip lines that start with '------'
        fwrite($f, $kb);
    }
}
AbraCadaver
  • 78,200
  • 7
  • 66
  • 87
  • But how can we be sure that '------' lands within a single chunk? What if '---' ends up at the end of the first chunk and '---' is the start of the second one? Or can that not happen? – Tamara Feb 09 '15 at 17:22
  • `fgets` reads to the end of the line `\n`, so the - would be at the beginning of the next line. – AbraCadaver Feb 09 '15 at 18:06
  • If that is the case, what happens with the header lines that follow the boundary? They will end up in the file and damage it as well. – DaGhostman Dimitrov Feb 10 '15 at 14:57
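Regarding the header lines raised in the last comment, here is a minimal sketch of how the same line-by-line idea could skip them too, assuming a single file part, CRLF line endings, and the filename query parameter from the question:

$in  = fopen('php://input', 'rb');
$out = fopen($_GET['filename'], 'wb');

$boundary = rtrim(fgets($in));               // first line of the body is the boundary itself

// Skip the part headers (Content-Disposition, Content-Type, ...);
// they end at the first empty line.
while (($line = fgets($in)) !== false && rtrim($line) !== '') {
}

// Copy the body, holding one line back so the CRLF that belongs to the
// closing boundary is not written into the file.
$prev = null;
while (($line = fgets($in)) !== false) {
    if (strpos($line, $boundary) === 0) {       // closing boundary reached
        if ($prev !== null) {
            fwrite($out, substr($prev, 0, -2)); // drop the trailing CRLF
        }
        break;
    }
    if ($prev !== null) {
        fwrite($out, $prev);
    }
    $prev = $line;
}

fclose($in);
fclose($out);

Note that fgets without a length limit buffers a whole line, so a binary file with very long runs between newlines could still hold a sizeable line in memory at once.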
1

You can use this (there are many like it). It supports chunked uploads, which means you won't hit any post/file max sizes as long as each chunk is smaller than post_max_size.

It also includes the PHP code you would need on the server side.
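The server side of chunked uploading usually boils down to something like this (a generic sketch, not the code from the linked project; the chunk, chunks, and file field names are made up for illustration):

// Each chunk arrives as an ordinary small POST upload plus an index and a total count.
$target = 'uploads/' . basename($_FILES['file']['name']);
$chunk  = isset($_POST['chunk'])  ? (int) $_POST['chunk']  : 0;
$chunks = isset($_POST['chunks']) ? (int) $_POST['chunks'] : 1;

// Append this chunk to a partial file (the first chunk truncates it).
$out = fopen("$target.part", $chunk === 0 ? 'wb' : 'ab');
$in  = fopen($_FILES['file']['tmp_name'], 'rb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);

// Once the last chunk has arrived, move the partial file into place.
if ($chunk === $chunks - 1) {
    rename("$target.part", $target);
}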

aljo f
  • 2,430
  • 20
  • 22
0

There is no need to reinvent the wheel. Just use POST and raise PHP's configured limits. Those limits can also be set on a per-directory or per-host basis.

Using .htaccess or your Apache configuration:

php_value upload_max_filesize 10G
php_value post_max_size 10G 

It is also a good idea to adjust other limits, like max_input_time.

Don't forget to relocate the received file using move_uploaded_file (otherwise PHP discards the temporary upload at the end of the request).
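A minimal sketch, assuming the form posts the file in a field named "file" and that a writable uploads/ directory exists:

// upload.php: plain POST handler once the limits above have been raised.
if (isset($_FILES['file']) && $_FILES['file']['error'] === UPLOAD_ERR_OK) {
    $dest = 'uploads/' . basename($_FILES['file']['name']);
    if (move_uploaded_file($_FILES['file']['tmp_name'], $dest)) {
        header('HTTP/1.1 201 Created');
    } else {
        header('HTTP/1.1 500 Internal Server Error');
    }
}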

rodrigovr
  • 454
  • 2
  • 7