I am trying to upload a .csv or .xls file that contains about 10,000 records. On the first attempt, the function below read the CSV file and inserted about 1,600 rows into the transactions table in ~2 minutes, without the line set_time_limit(0); then it stopped with a maximum execution time exceeded error.
On the second attempt, with set_time_limit(0) added, all 10,000 records were inserted, but it took ~11 minutes.
My questions now are:
1. How can I read a csv/xls file and insert its rows (10,000 or more) into the database in a Laravel app much faster? (I have sketched a chunked-insert idea after my code below.)
2. Is calling set_time_limit(0) the way I did here good practice, or is there a better way to handle a long-running import?
3. Before the reading and insertion starts, is there a way to estimate how long (or at most how long) the operation will take? (I sketched a rough timing estimate at the very end.)
public function importDataSet(Request $request)
{
    if ($request->hasFile('file')) {
        $file = $request->file('file');

        // Build a unique filename and move the upload into public/data-sets
        $filename = uniqid() . $file->getClientOriginalName();
        $filelocation = 'data-sets';
        $filepath = $file->move($filelocation, $filename);

        // Cope with files that use Mac/Windows line endings
        ini_set('auto_detect_line_endings', true);

        // Parse the CSV into an array of rows and drop the header row
        $rows = array_map('str_getcsv', file($filepath));
        $header = array_shift($rows);

        // Allow this request to run past the default max_execution_time
        set_time_limit(0);

        // Insert each row as a separate Transaction record (one INSERT query per row)
        foreach ($rows as $row) {
            $transaction = new Transaction;
            $transaction->dataset_id = 1;
            $transaction->itemsets = json_encode($row);
            $transaction->save();
        }
    }

    return 'Success';
}
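For question 1, what I am considering trying next (just a sketch, not yet tested at scale): instead of calling save() once per row, build the rows in memory and insert them in chunks with the query builder's insert(), so each chunk becomes a single INSERT statement. The Transaction model, the dataset_id value of 1, and the chunk size of 500 below are assumptions carried over from my code above.

// Minimal chunked-insert sketch, assuming the same $rows array as above.
// created_at/updated_at are filled manually because insert() bypasses
// Eloquent timestamps (assuming the transactions table has those columns).
set_time_limit(0);

$now = date('Y-m-d H:i:s');
$records = array_map(function ($row) use ($now) {
    return [
        'dataset_id' => 1,
        'itemsets'   => json_encode($row),
        'created_at' => $now,
        'updated_at' => $now,
    ];
}, $rows);

// One INSERT statement per 500 rows instead of 10,000 single-row INSERTs
foreach (array_chunk($records, 500) as $chunk) {
    Transaction::insert($chunk);
}

My assumption is that with bulk inserts like this the import may finish fast enough that set_time_limit(0) is no longer needed; for very large files I have also read that people move the import into a queued job so the HTTP request can return immediately.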
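For question 3, I don't know of a way to predict the exact duration up front, but a rough estimate could be made by timing a small sample and extrapolating. A sketch (the sample size of 100 is arbitrary):

$start = microtime(true);

// Insert only the first 100 rows to measure the per-row cost
foreach (array_slice($rows, 0, 100) as $row) {
    $transaction = new Transaction;
    $transaction->dataset_id = 1;
    $transaction->itemsets = json_encode($row);
    $transaction->save();
}

$perRow = (microtime(true) - $start) / 100;
$estimatedSeconds = $perRow * count($rows); // rough estimate for the whole file

Is something along these lines reasonable, or is there a better approach?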