
I am trying to upload a .csv or .xls file that contains about 10,000 records. On the first attempt, without the line set_time_limit(0);, the function below read the CSV file and inserted about 1,600 rows into the transactions table in ~2 minutes, and then I got a maximum execution timeout error.

On the second attempt, with set_time_limit(0); added, all 10,000 records were inserted in ~11 minutes.

My questions now are:

1. How can I read a CSV/XLS file and insert its 10,000 or more rows into the database in a Laravel app quickly?

2. Is the set_time_limit(0) approach I used here good practice? Is there a better way to go about this?

3. Before the reading and insertion take place, is there a way to estimate the (maximum) time the operation could take to complete?

public function importDataSet(Request $request)
{

    if (Input::hasFile('file')) {
        $file = Input::file('file');
        $uname = uniqid();
        $oname = $file->getClientOriginalName();
        $filename = $uname.$oname;
        // move the upload into public/data-sets (created if it does not exist)
        $filelocation = 'data-sets';
        $filepath = $file->move($filelocation, $filename);

        ini_set('auto_detect_line_endings', TRUE);

        // parse the CSV into an array of rows and strip off the header row
        $rows = array_map('str_getcsv', file($filepath));
        $header = array_shift($rows);

        set_time_limit(0); // lift the maximum execution time limit for this request

        // one Eloquent save(), i.e. one INSERT query, per row
        foreach ($rows as $row) {
            $transaction = new Transaction;
            $transaction->dataset_id = 1;
            $transaction->itemsets = json_encode($row);
            $transaction->save();
        }
    } else {
        // nothing to do here
    }
    return 'Success';
}
Ezehlivinus

1 Answer


This is, in my opinion, one of the big disadvantages of an active-record ORM like Eloquent: you are executing a separate database query for every record you save, which here means 10,000 queries.

One easy way to solve it is to use mysqlimport, since you have a .csv file anyway. This mostly applies to one-off uploads, though.
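
Purely as an illustration (the user, database and path below are placeholders): mysqlimport derives the target table name from the file name, so the file would need to be called transactions.csv to load the transactions table, and --ignore-lines=1 skips the header row.

mysqlimport --local --ignore-lines=1 \
    --fields-terminated-by=',' --fields-optionally-enclosed-by='"' \
    -u db_user -p my_database /path/to/transactions.csv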

Another is to use the insert() method of your model or the query builder, which inserts many rows in a single query. There's already a great answer here that pretty much directly answers your question.
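
As a rough sketch of that second approach (not your exact code: it assumes the transactions table has created_at/updated_at columns, which insert() does not fill automatically; drop those two fields if it does not):

// build all rows in memory first...
$now = now();
$records = [];
foreach ($rows as $row) {
    $records[] = [
        'dataset_id' => 1,
        'itemsets'   => json_encode($row),
        'created_at' => $now,
        'updated_at' => $now,
    ];
}

// ...then insert them in batches of 500, so ~20 queries run instead of 10,000
foreach (array_chunk($records, 500) as $chunk) {
    Transaction::insert($chunk);
}

Because insert() goes straight to the query builder, Eloquent model events and automatic timestamps are skipped, which is a large part of why it is so much faster than calling save() once per row.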

Loek
  • Thanks. Your answer solved the problem. The reading and insertion of the **10,000** records now took about 13 seconds – Ezehlivinus May 09 '18 at 21:28