
I built a Laravel app that reads Excel files into a collection using laravel-excel and stores the data in the database. Each row represents one model, so inside a loop over the collection I create a new model, assign its attributes, and save it, row after row until the rows are finished.
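A simplified version of the import, roughly how it works now (the real model and columns differ; `Product`, `name` and `price` are just placeholders):

```php
<?php

namespace App\Imports;

use App\Product;
use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;

class ProductsImport implements ToCollection
{
    public function collection(Collection $rows)
    {
        // One model is created and saved per row, so every row
        // costs a separate INSERT query.
        foreach ($rows as $row) {
            $product = new Product();
            $product->name  = $row[0];
            $product->price = $row[1];
            $product->save();
        }
    }
}
```

The import is triggered with `Excel::import(new ProductsImport, $path);` via the `Maatwebsite\Excel\Facades\Excel` facade.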

This works well for small Excel files (fewer than 2,000 rows), but with large files (13,000 rows or more) the app starts to crash. I set max_execution_time=300 in php.ini, but that didn't help either.

Any idea or technique to make this work?

Thanks

Ya Basha
  • I receive `504 Gateway Time-out` when reading large files – Ya Basha Oct 09 '19 at 07:54
  • Did you restart your server after changing the `php.ini` file? You might have to increase your `max_execution_time`. If it's only a one time insert then you can make `max_execution_time=0` for infinite length to complete the insert and change it back afterwards. Check out this answer: https://stackoverflow.com/questions/16171132/how-to-increase-maximum-execution-time-in-php – user931018 Oct 09 '19 at 08:17
  • Yes I restarted the server – Ya Basha Oct 09 '19 at 08:20
  • In the server error log: `upstream timed out (110: Connection timed out) while reading response header from upstream request: POST /process HTTP/2.0, upstream: php7.2-fpm.sock` – Ya Basha Oct 09 '19 at 08:21
  • Have you seen this? https://docs.laravel-excel.com/3.1/imports/chunk-reading.html – Eyad Jaabo Oct 09 '19 at 09:20
  • Laravel Excel chunk reading doesn't work for me because I read the Excel file into a collection, not into a model directly (see the sketch after these comments) – Ya Basha Oct 09 '19 at 09:37
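For reference, the chunk-reading setup from the linked docs pairs `WithChunkReading` with `ToModel`, so the import would have to be rewritten away from `ToCollection`. A minimal sketch, reusing the placeholder `Product` model from above (and assuming `name` and `price` are mass assignable):

```php
<?php

namespace App\Imports;

use App\Product;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class ProductsImport implements ToModel, WithChunkReading
{
    // Called once per row; the file is read 1,000 rows at a time,
    // so the whole spreadsheet is never held in memory at once.
    public function model(array $row)
    {
        return new Product([
            'name'  => $row[0],
            'price' => $row[1],
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
```

Chunk reading mainly bounds memory use; the 504 in the comments above is the web server timing out, so a long-running import may still need the upstream timeout raised or the work moved out of the request.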

0 Answers