
I am using https://github.com/rap2hpoutre/fast-excel to import many large Excel files in a loop.

However, memory gets exhausted partway through the process.

Below is the error:

2020-08-20 15:16:28 Saved 13867 records from file '/Users/akshaylokur/Work/GitHub/my-app/storage/app/my_excel.xlsx'.
PHP Fatal error:  Allowed memory size of 134217728 bytes exhausted (tried to allocate 20480 bytes) in /Users/akshaylokur/Work/GitHub/my-app/vendor/laravel/framework/src/Illuminate/Database/Connection.php on line 338

Here is the code snippet:

foreach ($files as $fileName) {

    $objs = (new FastExcel)->import($fileName, function ($line) {
        return MyModel::create([
            'col_a' => $line['col_a'],
            'col_b' => $line['col_b'],
            // ...
        ]);
    });
}

Appreciate any clues, thanks

Akshay Lokur
  • You could do this job in the background, chunk the file into smaller pieces, and send a notification to the user once the process is finished. – Tohid Dadashnezhad Aug 20 '20 at 19:26
  • @TohidDadashnezhad Any idea how I can chunk with FastExcel? – Akshay Lokur Aug 21 '20 at 01:23
  • Reading the source code of this package, it uses PHP generators over the Excel sheets and their rows. However, it puts every result, i.e. every model that is created, into an array named $collection, so I think this may cause the error you mentioned. What I recommend you test is to override the importSheet method of https://github.com/rap2hpoutre/fast-excel/blob/master/src/Importable.php and, rather than using MyModel::create, return a raw array; in the overridden method, check whether $collection has reached 100 rows, then apply a batch insert and empty $collection. – Tohid Dadashnezhad Aug 21 '20 at 10:02
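
For reference, here is a minimal sketch of the batch-insert idea from the last comment, kept inside the existing import callback rather than overriding importSheet. It assumes FastExcel only keeps truthy callback return values in its internal collection; the App\Models\MyModel namespace, the column names, and the chunk size of 500 are illustrative placeholders, not values confirmed by the question.

use Rap2hpoutre\FastExcel\FastExcel;
use App\Models\MyModel; // assumed namespace; adjust to the real model

$batch = [];

foreach ($files as $fileName) {
    (new FastExcel)->import($fileName, function ($line) use (&$batch) {
        // Buffer plain arrays instead of creating one Eloquent model per row.
        $batch[] = [
            'col_a' => $line['col_a'],
            'col_b' => $line['col_b'],
            // ... remaining columns
        ];

        // Flush the buffer with a single bulk INSERT every 500 rows.
        if (count($batch) >= 500) {
            MyModel::insert($batch);
            $batch = [];
        }

        // Return nothing so FastExcel's internal collection stays small
        // (assuming it only keeps truthy callback results).
    });
}

// Insert whatever is left over after the last file.
if (!empty($batch)) {
    MyModel::insert($batch);
}

Note that MyModel::insert() goes through the query builder, so it skips Eloquent timestamps and mutators; if created_at/updated_at are needed, they would have to be added to each row manually.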

0 Answers