
I created my code following a few tutorials. As my CSV file has more than 300k rows, I need to upload it in chunks, but I am unable to figure out how to do that.

I mainly followed this tutorial, along with several others including some Laracasts discussions: https://itsolutionstuff.com/post/import-and-export-csv-file-in-laravel-58example.html

My Controller import function


    public function import()
    {
        Excel::import(new ReportImport, request()->file('file'));
        return view('dashboard');
    }
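
A couple of the comments below suggest validating the CSV and storing the file first, then processing it separately. As a hedged sketch (not my original code), the controller could store the upload and import from the stored path; the `Request $request` parameter, the `imports` directory, and the mimes rule are illustrative assumptions:

    // At the top of the controller:
    // use Illuminate\Http\Request;
    // use App\Imports\ReportImport;
    // use Maatwebsite\Excel\Facades\Excel;

    public function import(Request $request)
    {
        // Hypothetical validation: accept only CSV / plain-text uploads.
        $request->validate([
            'file' => 'required|file|mimes:csv,txt',
        ]);

        // Store the upload on the default (local) disk so it can also be
        // processed later, e.g. by a queued import or a scheduled job.
        $path = $request->file('file')->store('imports');

        // Import from the stored file; the reader type is passed explicitly
        // because the hashed file name may not keep a .csv extension.
        Excel::import(new ReportImport, $path, 'local', \Maatwebsite\Excel\Excel::CSV);

        return view('dashboard');
    }

Note this does not by itself fix the exception shown further down: PostTooLargeException is thrown by Laravel's ValidatePostSize middleware before the controller runs, so PHP's POST limits still have to allow the upload.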

My ReportImport File

    namespace App\Imports;

    use App\Report;
    use Maatwebsite\Excel\Concerns\ToModel;
    use Maatwebsite\Excel\Concerns\WithHeadingRow;

    class ReportImport implements ToModel, WithHeadingRow
    {
        /**
         * @param array $row
         *
         * @return \Illuminate\Database\Eloquent\Model|null
         */
        public function model(array $row)
        {
            return new Report([
                'district' => $row['district'],
                'age' => $row['age'],
                'email' => $row['email'],
                'name' => $row['name'],
            ]);
        }
    }
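
For the memory side, Laravel Excel 3.1 ships `WithChunkReading` and `WithBatchInserts` concerns that read the spreadsheet in chunks and insert rows in batches, rather than loading everything and issuing one query per row. A sketch of how `ReportImport` could use them; the chunk and batch sizes of 1000 are arbitrary assumptions to tune:

    namespace App\Imports;

    use App\Report;
    use Maatwebsite\Excel\Concerns\ToModel;
    use Maatwebsite\Excel\Concerns\WithHeadingRow;
    use Maatwebsite\Excel\Concerns\WithChunkReading;
    use Maatwebsite\Excel\Concerns\WithBatchInserts;

    class ReportImport implements ToModel, WithHeadingRow, WithChunkReading, WithBatchInserts
    {
        public function model(array $row)
        {
            return new Report([
                'district' => $row['district'],
                'age' => $row['age'],
                'email' => $row['email'],
                'name' => $row['name'],
            ]);
        }

        // Read the file 1000 rows at a time instead of loading all 300k rows at once.
        public function chunkSize(): int
        {
            return 1000;
        }

        // Insert the resulting models in batches of 1000 instead of one query per row.
        public function batchSize(): int
        {
            return 1000;
        }
    }

Chunk reading only helps once the file has reached the server; it does not shrink the HTTP POST body, which is what the exception below complains about.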


When I upload the file, the request fails with this exception:

    Illuminate\Http\Exceptions\PostTooLargeException (no message)
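
The exception is thrown by Laravel's ValidatePostSize middleware when the request body exceeds PHP's post_max_size, before the controller is ever reached. If you control the server, the relevant php.ini directives look roughly like this; the values below are only illustrative assumptions for a ~35 MB upload:

    ; php.ini - example values only
    upload_max_filesize = 64M
    post_max_size = 64M        ; must be at least upload_max_filesize
    memory_limit = 256M
    max_execution_time = 300

The first two govern whether the upload is accepted at all; memory_limit and max_execution_time only matter once the import itself starts. Remember to restart PHP-FPM/Apache after changing them.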

  • With a `PostTooLargeException` it does not matter how you upload the file - unless you break it up into several files and do several separate uploads, if that's what you mean by "chunks". – Tomalak Jun 25 '19 at 16:28
  • @Tomalak yes, that is what I meant by chunks. Is there any possibility to read the file content and upload it as several batches of rows? PS: Many thanks for your reply. – Sechan Kuchan Jun 25 '19 at 16:30
  • Ah, because there is the ["Chunked" transfer encoding](https://en.wikipedia.org/wiki/Chunked_transfer_encoding), which I wanted to rule out. It would probably not help you as you would still hit the server's maximum POST size. – Tomalak Jun 25 '19 at 16:44
  • Although you could increase the server's limit if you have control over the server. – Tomalak Jun 25 '19 at 16:45
  • Do you have any validations for the CSV content while uploading? – manu Jun 26 '19 at 06:04
  • If you have validations, then it is better to move the processing into a scheduler (cron), which helps process large data sets without timeout issues. Just upload the file first and let a scheduler process everything (see the queued-import sketch after these comments). – manu Jun 26 '19 at 06:06
  • @Tomalak I've tried your suggestion of increasing the server limit. That gives enough room to handle around 30k rows, but it still can't handle a huge file (i.e. 300k rows). Memory gets exhausted even though there are more than enough resources left. – Sechan Kuchan Jun 26 '19 at 10:02
  • @manu I haven't used any specific validator for the file content, just the file type validation at upload. I'll still try with a scheduler, though. Thank you for your suggestion. :) – Sechan Kuchan Jun 26 '19 at 10:04
  • How large is the file in MB? – Tomalak Jun 26 '19 at 10:46
  • @Tomalak it is around 35 MB – Sechan Kuchan Jun 27 '19 at 08:37
  • That should not be a problem. I've dug up some posts on the topic; see if you can use some of the info: https://haydenjames.io/understanding-php-memory_limit/ and https://stackoverflow.com/questions/16102809/how-to-upload-large-files-above-500mb-in-php (also see the linked posts and the duplicate). – Tomalak Jun 27 '19 at 09:33
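
Following up on the scheduler/queue suggestion above: in Laravel Excel 3.1, an import that implements ShouldQueue together with WithChunkReading is processed as a chain of queued jobs, one per chunk, so the web request only dispatches the work. A hedged sketch, again with an arbitrary chunk size:

    namespace App\Imports;

    use App\Report;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Maatwebsite\Excel\Concerns\ToModel;
    use Maatwebsite\Excel\Concerns\WithHeadingRow;
    use Maatwebsite\Excel\Concerns\WithChunkReading;

    // With ShouldQueue + WithChunkReading, Excel::import() (or Excel::queue())
    // dispatches each chunk as a job for a queue worker instead of running it
    // inside the HTTP request.
    class ReportImport implements ToModel, WithHeadingRow, WithChunkReading, ShouldQueue
    {
        public function model(array $row)
        {
            return new Report([
                'district' => $row['district'],
                'age' => $row['age'],
                'email' => $row['email'],
                'name' => $row['name'],
            ]);
        }

        public function chunkSize(): int
        {
            return 1000;
        }
    }

Because the jobs run after the request has finished, the import must read from a stored file (as in the controller sketch above) rather than the temporary uploaded file, and a queue worker has to be running (or be started by the scheduler).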
