I'm pulling my hair out. I'm using plupload with chunking on a PHP backend. Uploads (chunked or not, for that matter) go fine until I use files larger than 100 MB, but I'd like to support files up to 1 GB. I keep getting the error, then the retries kick in, the percentage is back at 0%, and the upload restarts. There is also no preflight OPTIONS request. I even made a quick Node.js backend with node-pluploader, which handles chunks, but I get the same problem and errors there (although that one did send the preflight OPTIONS request).

Any advice would be greatly appreciated.
Additional Apache directives:

```
LimitRequestBody 0
```
PHP settings:

```ini
file_uploads = On
max_file_uploads = 20
max_execution_time = 3600
memory_limit = 640M
post_max_size = 600M
upload_max_filesize = 500M
```
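For scale: each chunk travels as its own POST request, so if chunking is actually in effect, only the 10 MB chunk (plus some multipart overhead) has to fit under these PHP limits, not the whole file. A quick numeric sanity check (the `toBytes` helper is hypothetical, written just for this comparison):

```javascript
// Hypothetical helper (not from plupload) that converts shorthand sizes
// such as "600M" or "10mb" into bytes so the limits can be compared.
function toBytes(size) {
  const m = /^(\d+(?:\.\d+)?)\s*([kmgt]?)b?$/i.exec(String(size).trim());
  if (!m) throw new Error('unparsable size: ' + size);
  const mult = { '': 1, k: 1024, m: 1024 ** 2, g: 1024 ** 3, t: 1024 ** 4 };
  return Math.round(parseFloat(m[1]) * mult[m[2].toLowerCase()]);
}

// With chunking working, each POST carries only one chunk:
console.log(toBytes('10mb') < toBytes('500M')); // chunk vs upload_max_filesize
console.log(toBytes('10mb') < toBytes('600M')); // chunk vs post_max_size
```

So on paper these limits should never be hit by a 10 MB chunk, which points the finger at something outside PHP (or at chunking not actually happening).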
```js
var uploader = new plupload.Uploader({
    runtimes : 'html5,silverlight,html4',
    browse_button : 'pickfiles',
    url : '<?php echo SITE_URL; ?>admin?sub=upload',
    silverlight_xap_url : '<?php echo SITE_URL; ?>lib/Moxie.xap',
    multi_selection : false,
    max_retries : 10,
    filters : {
        max_file_size : '1024mb',
        chunk_size : '10mb'
    },
    init : {
        FilesAdded : function(up, files) {
            plupload.each(files, function(file) {
                $('#document-file').html(file.name);
            });
        },
        UploadProgress : function(up, file) {
            $('#spinner-progress').html(" " + file.percent + "%");
        },
        Error : function(up, err) {
            console.warn("Error #" + err.code + ": " + err.message);
        },
        UploadComplete : function(up, files) {
            console.log(files[0]);
        }
    }
});
uploader.init();
```
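One thing that might help narrow this down: plupload 2 fires a `ChunkUploaded` event that exposes the HTTP status and body the server returned for each chunk, so a chunk rejected by a proxy, mod_security rule, or size limit shows up immediately instead of surfacing as a generic retry. A sketch of such a handler (the `uploader` object here is a minimal stub standing in for the real instance, so the snippet is self-contained):

```javascript
// Minimal stub standing in for the real plupload.Uploader instance --
// only bind() is modeled so the handler below can run in isolation.
var uploader = {
  handlers: {},
  bind: function (name, fn) { this.handlers[name] = fn; }
};

// "ChunkUploaded" is a real plupload 2 event; the info argument carries
// offset/total progress plus the raw HTTP reply for that chunk.
uploader.bind('ChunkUploaded', function (up, file, info) {
  console.log('chunk done: ' + info.offset + '/' + info.total +
              ' status=' + info.status);
  if (info.status !== 200) {
    console.warn('server replied: ' + info.response);
  }
});

// Exercising the stub directly with a fake info object:
uploader.handlers['ChunkUploaded'](null, null, {
  offset: 10485760, total: 1073741824, status: 200, response: '{"OK":1}'
});
```

With the real uploader, binding this before `uploader.init()` would show exactly which chunk first fails and with what status code.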
Uploader backend:

```php
<?php
require_once(LIB.SLASH."PluploadHandler.php");

PluploadHandler::no_cache_headers();
PluploadHandler::cors_headers();

if (!PluploadHandler::handle(array(
    'tmp_dir' => UPL_TMP_PATH,
    'target_dir' => UPL_TMP_PATH
))) {
    die(json_encode(array(
        'OK' => 0,
        'error' => array(
            'code' => PluploadHandler::get_error_code(),
            'message' => PluploadHandler::get_error_message()
        )
    )));
} else {
    die(json_encode(array('OK' => 1)));
}
```
I'm using an unchanged plupload-handler-php example.