The problem
I am using Laravel 5.3 to import a huge tab-separated file (more than 1 million rows and more than 25 columns) into a MySQL database using functions in controller code (I am refraining from posting all the code here). While processing the file I run into the following error:
FatalErrorException in Connection.php line 720:
Maximum execution time of 30 seconds exceeded
Please note that the application imports a different number of rows on each attempt before failing.
Question
I know we can fix this using either of the following:
- changing max_execution_time in php.ini, as suggested here
- adding ini_set('max_execution_time', 300); at the beginning of public/index.php, as suggested here (see the snippet after this list)
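For reference, this is roughly what the second option looks like — a sketch, not my actual file; only the ini_set() line is added on top of the stock Laravel 5.3 front controller:

```php
<?php
// public/index.php — the only addition is the ini_set() call at the top;
// everything below is the stock Laravel 5.3 front controller.

ini_set('max_execution_time', 300); // raise the limit from 30 s to 300 s

require __DIR__.'/../bootstrap/autoload.php';

$app = require_once __DIR__.'/../bootstrap/app.php';

// ... rest of the default index.php is unchanged
```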
Several different reasons might be behind this, and I am more interested in knowing where exactly it is running out of time. Laravel doesn't provide any more detail than the message above. I would really appreciate it if someone could suggest ways to debug this. Things that would help:
- Is the time limit an aggregate over all requests handled by a method?
- Can memory overload cause this?
- Would it help to chunk the data and handle it across multiple requests? (see the sketch after this list)
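To illustrate the last point, this is a rough sketch of what I mean by chunking — reading the TSV in batches and doing one multi-row insert per batch instead of a query per row. The table name imports and the column names are placeholders, not my real schema; splitting the batches across multiple requests or queued jobs would build on the same idea:

```php
<?php
// Rough sketch of "chunking": read the TSV in batches and do one multi-row
// insert per batch instead of a query per row.
// Assumptions: a table named `imports` and columns col1..col3 (placeholders).

use Illuminate\Support\Facades\DB;

$handle = fopen(storage_path('app/import.tsv'), 'r');
$batch  = [];

while (($row = fgetcsv($handle, 0, "\t")) !== false) {
    $batch[] = [
        'col1' => $row[0],
        'col2' => $row[1],
        'col3' => $row[2],
    ];

    if (count($batch) === 1000) {            // flush every 1000 rows
        DB::table('imports')->insert($batch);
        $batch = [];
    }
}

if (! empty($batch)) {                       // flush the remainder
    DB::table('imports')->insert($batch);
}

fclose($handle);
```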
Environment
- Laravel 5.3
- CentOS 7 on Vagrant
- MySQL