
First of all, sorry for my English.

I am trying to upload a large file with Laravel. I understand that I need to stream this file to my local storage. I am also using blueimp/jquery-file-upload, which can split the upload into chunks. Nevertheless, should I use chunks on the client side at all if I stream the file on the server side?

All I need is to upload a large file, or even a few files, with a progress bar. It would also be nice if the upload didn't eat all my RAM.
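For the progress bar I plan to hook into the plugin's `progressall` callback; a minimal sketch based on the plugin's basic demo (the `#progress .progress-bar` markup is just an assumption taken from that demo):

    $('#fileupload').fileupload({
        url: 'upload',
        dataType: 'json',
        // Fires repeatedly while the upload is in flight, aggregated across all files
        progressall: function (e, data) {
            var progress = parseInt(data.loaded / data.total * 100, 10);
            $('#progress .progress-bar').css('width', progress + '%');
        }
    });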

All I have tried so far is the default setup of the blueimp/jquery-file-upload plugin, which looks like this:

    // Initialize the jQuery File Upload widget:
    $('#fileupload').fileupload({
        // Uncomment the following to send cross-domain cookies:
        //xhrFields: {withCredentials: true},
        url: 'upload',
        //maxChunkSize: 10000000
    });

And the Laravel controller looks like this:

    $images = $request->file('files');
    foreach ($images as $file)
    {
        // getClientOriginalName() already includes the extension, so don't append it again
        $imageName = $file->getClientOriginalName();
        $disk = Storage::disk('local');
        // Passing a stream handle lets Laravel copy the file without loading it all into RAM
        $disk->put($imageName, fopen($file, 'r+'));
    }

When I want to chunk the file on the client side, I set maxChunkSize: 1024 *1024 *1024 *1024 *1024 *1024 * 10
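Concretely, that just means uncommenting the `maxChunkSize` option in the widget setup above; a minimal sketch (the 10 MB value here is only an illustration, not the value I actually used):

    $('#fileupload').fileupload({
        url: 'upload',
        // chunk size is given in bytes, so 1024 * 1024 * 10 is 10 MB per request
        maxChunkSize: 1024 * 1024 * 10
    });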

And I change the server side to this:

    $images = $request->file('files');
    foreach ($images as $file)
    {
        $file->store('f/', 'local');
    }

It saves files smaller than about 100 MB, but if I try to upload a slightly larger file I get a Payload Too Large error, with the following message in the debug console:

POST Content-Length of 445883220 bytes exceeds the limit of 134217728 bytes in

But why? Am I not using chunks? I am, and it still doesn't work properly.

Squaddie
  • Well, the code you've provided has `maxChunkSize` commented out, so if it's supposed to do something it's not. – miken32 Apr 10 '19 at 22:22
  • Yes, it is. But when I use chunks I of course uncomment it, and it's still not working. – Squaddie Apr 10 '19 at 22:34
  • `1024 *1024 *1024 *1024 *1024 *1024 * 10` is 10 _exabytes_. 10GB would be `1024 *1024 *1024 * 10`. – Sammitch Apr 10 '19 at 22:36
  • Yes, but it doesn't matter whether it's 1024*1024*1024*10 or more; I still get POST Content-Length of 445883220 bytes exceeds the limit of 134217728 bytes. – Squaddie Apr 10 '19 at 22:40
  • Even if the file is split it's still sent over a single multi-part request so normal POST size limits would apply. You can refer to https://stackoverflow.com/questions/6135427/increasing-the-maximum-post-size for how to increase that (see the php.ini sketch after these comments). – apokryfos Apr 10 '19 at 22:42
  • I mean, if I chunk it, the server gets small parts of one big file, right? – Squaddie Apr 10 '19 at 22:42
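For reference, the limit quoted in the error message comes from PHP's request-size settings (134217728 bytes is 128M); a minimal php.ini sketch, with example values only, would be:

    ; php.ini – both limits must cover the whole POST body (or one chunk, if chunking works)
    upload_max_filesize = 512M
    post_max_size = 512M

A reverse proxy such as nginx can impose its own `client_max_body_size` limit on top of this.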

0 Answers