I'm trying to implement simple chunked file uploads for large files using the jQuery-File-Upload plugin. I cloned the GitHub repo into a folder and changed my main.js to the following:
    $('#fileupload').fileupload({
        maxChunkSize: 10000000,
        add: function (e, data) {
            var that = this;
            $.getJSON('server/php/', {file: data.files[0].name}, function (file) {
                data.uploadedBytes = file && file.size;
                $.blueimp.fileupload.prototype.options.add.call(that, e, data);
            });
        }
    });
as described in the docs. This works fine for files up to 2GB (I can see the chunking in the console and on the server), but when I try to upload files larger than 2.1GB the following happens: the last chunk of data is correctly appended to the existing file myUpload.file, and then the script starts writing a new file myUpload.file (1) from scratch. No errors are thrown, neither client- nor server-side. The same thing happens once the second file reaches 2.1GB. The file's progress counter resets, while the overall progress counter overflows and tells me that I'm 142% (and counting) done.
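One thing I noticed (this may be a red herring): the cutoff is suspiciously close to the signed 32-bit integer limit, which would matter if something on the server is tracking byte counts in a 32-bit int:

```javascript
// maximum value of a signed 32-bit integer, in bytes
var maxInt32 = Math.pow(2, 31); // 2147483648
console.log((maxInt32 / 1e9).toFixed(1) + ' GB'); // prints "2.1 GB"
```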
I'm running PHP 5.3.5 on the server side.
I already tried fiddling with the post_max_size, upload_max_filesize and memory_limit settings as described in the plugin's FAQ and this question on SO, but didn't have any success. I also found this issue on GitHub that sounds similar, but it got closed without any real input.
Other than fiddling with the php.ini settings, I'm pretty clueless about what to do about this.
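For reference, the relevant part of my php.ini currently looks roughly like this (the exact values are a sketch; I've tried several combinations):

```ini
; sketch of the directives I've been adjusting (values are examples)
post_max_size = 4G
upload_max_filesize = 4G
memory_limit = 512M
```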