
Using the import to models option, I am importing an XLS file with about 15,000 rows.

Using a microtime_float helper, the script times the import and echoes how long it took. It reports about 29.6 seconds, so the import itself finishes in under 30 seconds. At that point I can already see that the database has all 15k+ records as expected, so no issues there.

The problem is that the browser stays busy afterwards: at 1 min 22 secs, 1 min 55 secs and 2 min 26 secs it prompts me to either wait or kill the page. I keep clicking wait and the request finally finishes at 2 min 49 secs.

This is a terrible user experience. How can I cut off this extra wait time?

It's a very basic setup: the route calls ImportController@import via HTTP GET, and the controller code is as follows:

public function import()
{
    // Raise the memory limit for the large spreadsheet
    ini_set('memory_limit', '1024M');

    $start = $this->microtime_float();

    // Import the XLS file into models via the myImport class
    Excel::import(new myImport, 'myfile.xls', null, \Maatwebsite\Excel\Excel::XLS);

    $end = $this->microtime_float();

    // Return the elapsed time so I can see how long the import took
    $t = $end - $start;
    return "Time: $t";
}
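
For reference, the route and the timing helper are not shown above; a minimal sketch of what they might look like, assuming a plain GET route (pre-Laravel-8 string syntax) and a microtime_float helper that simply wraps microtime(), would be:

// routes/web.php — assumed wiring (path and syntax are illustrative)
Route::get('/import', 'ImportController@import');

// Assumed timing helper on the controller
private function microtime_float()
{
    return microtime(true); // current Unix timestamp in seconds, with microseconds
}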

The import class implements the following concerns:

class myImport implements ToModel, WithBatchInserts, WithChunkReading, WithStartRow 
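
The body of the class is not included in the question; a minimal sketch of what an import class with those concerns typically looks like is below. The model, column indexes, batch/chunk sizes and start row are all illustrative assumptions, not the actual code from the question.

use App\MyModel;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithStartRow;

class myImport implements ToModel, WithBatchInserts, WithChunkReading, WithStartRow
{
    // Map each spreadsheet row to a model (model and column indexes are assumptions)
    public function model(array $row)
    {
        return new MyModel([
            'name'  => $row[0],
            'email' => $row[1],
        ]);
    }

    // Insert models in batches (size is an assumption)
    public function batchSize(): int
    {
        return 1000;
    }

    // Read the spreadsheet in chunks of rows (size is an assumption)
    public function chunkSize(): int
    {
        return 1000;
    }

    // Skip the header row and start reading at row 2 (assumption)
    public function startRow(): int
    {
        return 2;
    }
}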
  • Hi, `set_time_limit(0);` or `ini_set('max_execution_time', '-1');`, or change your configuration as described here if you have access: https://stackoverflow.com/questions/3829403/how-to-increase-the-execution-timeout-in-php – Teymur Mardaliyer Lennon Sep 08 '20 at 13:32
  • Thanks, but the issue is not that it times out. The issue is that the script stays busy long after the import process is complete (which I can see in the database). – user1729972 Sep 08 '20 at 20:14
  • Do you call your endpoint with an XHR (ajax) call? – Teymur Mardaliyer Lennon Sep 09 '20 at 18:09
  • The route is typed directly in the browser, called from another PHP script using Guzzle, and also tested directly with cURL. My use case is calling it from an old non-Laravel PHP app. – user1729972 Sep 10 '20 at 16:27
