For a customer, I need to upload a CSV file with nearly 35,000 lines. I used the maatwebsite/excel package:
Excel::filter('chunk')->load($file->getRealPath())->chunk(100, function ($results) {
    foreach ($results as $row) {
        // Doing the import in the DB
    }
});
I can't raise max_execution_time because our server doesn't allow scripts to run longer than 300 seconds. I also tried another way without any package, but that failed as well:
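For what it's worth, most of that 300-second budget usually goes to issuing one INSERT per row. A plain-PHP sketch of batching the rows first (the table name, columns, and buildInsert helper here are illustrative, and actually executing the SQL is left to whatever DB layer is in use):

```php
<?php
// Sketch: group rows into batches so each statement inserts many
// rows at once instead of one INSERT per row. buildInsert() only
// assembles the parameterized SQL; executing it is not shown.

function buildInsert(string $table, array $batch): string
{
    $columns = implode(', ', array_keys($batch[0]));
    $rowPlaceholder = '(' . implode(', ', array_fill(0, count($batch[0]), '?')) . ')';
    $placeholders = implode(', ', array_fill(0, count($batch), $rowPlaceholder));

    return "INSERT INTO {$table} ({$columns}) VALUES {$placeholders}";
}

$rows = [
    ['name' => 'Widget',    'price' => '9.99'],
    ['name' => 'Gadget',    'price' => '19.99'],
    ['name' => 'Doohickey', 'price' => '4.50'],
];

// 500-1000 rows per batch is a common choice; 2 keeps the example small.
foreach (array_chunk($rows, 2) as $batch) {
    $sql = buildInsert('products', $batch);
    // bind the flattened batch values and execute $sql here
}
```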
$csv = utf8_encode(file_get_contents($file));
$array = explode("\n", $csv);

foreach ($array as $key => $row) {
    if ($key == 0) {
        // First line: build the header, replacing spaces in column names
        $head = explode(',', $row);
        foreach ($head as $k => $item) {
            $h[$key][] = str_replace(' ', '_', $item);
        }
    } else {
        // Data lines: key each value by its header name (first 21 columns)
        $product = explode(',', $row);
        foreach ($product as $k => $item) {
            if ($k < 21) {
                $temp[$key][$h[0][$k]] = $item;
            }
        }
    }
}

foreach ($temp as $key => $value) {
    // Doing the import in the DB
}
Does anyone have an idea?
Edit:
So I made an artisan command. When I execute it in the terminal, it runs and all 35,000 rows are imported. Thanks to common sence.
I just can't figure out how to run the command asynchronously so the user can close their browser. Can anyone explain how to get that done?
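For context, one common Laravel pattern for this (a sketch; it assumes a queue driver other than sync is configured and a worker is running, and the command name import:csv is made up) is to queue the artisan command from the controller so the HTTP request returns immediately:

```php
use Illuminate\Support\Facades\Artisan;

// Push the command onto the queue instead of running it inline.
// A worker process (php artisan queue:work) executes it in the
// background, so the HTTP response returns at once and the user
// can close the browser while the import continues.
Artisan::queue('import:csv', ['path' => $file->getRealPath()]);
```

Alternatively, from a shell the command can be detached with nohup and & so it survives the session ending.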