First of all, sorry for my English.
I am trying to upload a large file with Laravel. I understand that I need to stream the file to my local storage. I am also using blueimp/jquery-file-upload, which can split the upload into chunks. Nevertheless, should I use chunks on the client side if I stream the file on the server side?
All I need is to upload a large file, or even a few files, with a progress bar. It would also be nice if the upload did not eat all my RAM.
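To make clear what I mean by "streaming" on the server side, this is roughly the kind of controller I have in mind. It is only a sketch under my own assumptions: the UploadController name, the 'files' field and the JSON response are placeholders, and I am using Storage::writeStream because, as far as I know, it copies the file without loading it into memory all at once.

    <?php

    namespace App\Http\Controllers;

    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\Storage;

    class UploadController extends Controller
    {
        public function upload(Request $request)
        {
            foreach ($request->file('files') as $file) {
                // Open the temporary uploaded file as a stream so the whole
                // file never has to fit in memory at once.
                $stream = fopen($file->getRealPath(), 'rb');

                Storage::disk('local')->writeStream(
                    $file->getClientOriginalName(),
                    $stream
                );

                if (is_resource($stream)) {
                    fclose($stream);
                }
            }

            return response()->json(['uploaded' => true]);
        }
    }

Is that the right idea, or is chunking on the client still necessary on top of this?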
All I have tried is the default setup of the blueimp/jquery-file-upload plugin, which looks like this:
    // Initialize the jQuery File Upload widget:
    $('#fileupload').fileupload({
        // Uncomment the following to send cross-domain cookies:
        //xhrFields: {withCredentials: true},
        url: 'upload',
        //maxChunkSize: 10000000
    });
And the Laravel controller looks like this:
    $images = $request->file('files');

    foreach ($images as $file)
    {
        $extension = $file->getClientOriginalExtension();
        $imageName = $file->getClientOriginalName();
        $disk = Storage::disk('local');
        $disk->put("$imageName.$extension", fopen($file, 'r+'));
    }
When I want to chunk the file on the client side, I set maxChunkSize: 1024 * 1024 * 1024 * 1024 * 1024 * 1024 * 10
and change the server side to:
    $images = $request->file('files');

    foreach ($images as $file)
    {
        $file->store('f/', 'local');
    }
This saves files smaller than 100 MB, but if I try to upload a slightly larger file I get a Payload Too Large error, with the following message in the debug console:
POST Content-Length of 445883220 bytes exceeds the limit of 134217728 bytes in
But why? Am I not using chunks? I think I am, but it is not working properly.
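In case the answer is that the chunks have to be handled explicitly on the server, is it supposed to look roughly like the sketch below? This is only my guess: I am assuming that blueimp sends a Content-Range header with every chunk and reuses the same file name, and the f/ directory and the response format are just placeholders.

    public function upload(Request $request)
    {
        $file = $request->file('files')[0];
        $name = $file->getClientOriginalName();
        $target = storage_path('app/f/' . $name);

        if (! is_dir(dirname($target))) {
            mkdir(dirname($target), 0755, true);
        }

        // If a Content-Range header is present, this request carries one chunk
        // of a larger file, so append to the partial file instead of overwriting it.
        $mode = $request->header('Content-Range') ? 'ab' : 'wb';

        $in = fopen($file->getRealPath(), 'rb');
        $out = fopen($target, $mode);
        stream_copy_to_stream($in, $out);
        fclose($in);
        fclose($out);

        return response()->json(['files' => [['name' => $name]]]);
    }

Or is there a simpler way to get large uploads with a progress bar working?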