
I am making a module that uploads records to a database from an Excel file. The records are just phone numbers. Here's my code:

        $file = Input::file('file');
        Excel::load($file, function($reader) {
            // Get all rows from the uploaded sheet
            $results = $reader->get()->toArray();
            foreach ($results as $key => $value) {
                $phone = new Phone();
                $phone->msisdn = $value['msisdn'];
                $phone->save();
            }
        });

I'm using https://github.com/Maatwebsite/Laravel-Excel to read the Excel file. It works, but uploading 20,000 records takes about 20 minutes. Is there a way to do it faster? I know it also depends on the server, but are there other factors? I'm using MySQL.

Thanks

jackhammer013
  • 20,000 records in 20 minutes seems very slow. Can you log how long each operation takes to see where the bottleneck is? – Halcyon Sep 08 '15 at 14:21
  • Can you not save the (relevant bits of the) Excel file as a CSV and then use `LOAD DATA` or mysqlimport? – eggyal Sep 08 '15 at 14:25
  • The main issue here is that you're inserting in a loop. Instead of using Eloquent, go for the Query Builder. [Here](http://stackoverflow.com/questions/20248107/laravel-4-inserting-multiple-records-when-using-the-hasmany-relationship#answer-20249503) is an example to get you going. – Andrei Sep 08 '15 at 14:35
  • The problem is almost certainly on the Excel side of things. That package uses PHPExcel, which is an incredible package, but **extremely** resource intensive. [This question](http://stackoverflow.com/q/563670/2008384) asks about different packages for reading Excel files in PHP. I haven't researched these, so I can't speak for any of them. If you can save the Excel file as CSV, that's going to give you the best performance – Kryten Sep 08 '15 at 15:01
  • I would think looking at bulk inserts or `LOAD DATA FROM INFILE` are your best bets here. I would also think you could save time by not reading the entire file into memory if going the bulk insert route. Just read each line of the file and continue to build the bulk insert query, triggering the actual insert when the query reaches the number of records you want to insert at a time. I definitely would not take the extra step of instantiating a Phone object with each insert unless you need to do something like validate/transform the data before making the insert. – Mike Brant Sep 08 '15 at 15:45
  • Can you please tell what the Excel file format should be? – StealthTrails Oct 31 '15 at 21:08
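Following the comments above, a batched Query Builder insert would replace 20,000 single-row INSERTs with a few multi-row ones. A hedged sketch (the `phones` table name and a batch size of 500 are assumptions; adjust to your schema):

    $results = $reader->get()->toArray();
    // Build an array of rows instead of saving one Eloquent model per record
    $rows = [];
    foreach ($results as $value) {
        $rows[] = ['msisdn' => $value['msisdn']];
    }
    // Insert in batches so each query stays a reasonable size
    foreach (array_chunk($rows, 500) as $batch) {
        DB::table('phones')->insert($batch);
    }

Note that `DB::table()->insert()` bypasses Eloquent timestamps and events, which is usually acceptable for a raw import like this.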

2 Answers


From their documentation: since you are working with such a large file, chunk the results. That way you read the file in chunks rather than loading the whole file into memory.

http://www.maatwebsite.nl/laravel-excel/docs/import#chunk
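A minimal sketch of what that could look like, assuming the chunk filter API from the linked Laravel Excel 2.x docs (the chunk size of 250 and the row property name are illustrative):

    Excel::filter('chunk')->load($file)->chunk(250, function($results) {
        // Each callback invocation only holds 250 rows in memory
        foreach ($results as $row) {
            $phone = new Phone();
            $phone->msisdn = $row->msisdn;
            $phone->save();
        }
    });

This addresses the memory side; combining it with bulk inserts inside each chunk would also cut down the per-row query overhead.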

Leo

You could try this faster alternative (https://github.com/rap2hpoutre/fast-excel):

    (new FastExcel)->import($file, function ($line) {
        $phone = new Phone();
        $phone->msisdn = $line['msisdn'];
        $phone->save();
    });
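Since saving one model per line still issues one INSERT per record, the rows could instead be buffered and bulk-inserted. A sketch, assuming a `phones` table and an `msisdn` column header in the sheet:

    $rows = [];
    (new FastExcel)->import($file, function ($line) use (&$rows) {
        // Collect rows instead of saving each one individually
        $rows[] = ['msisdn' => $line['msisdn']];
    });
    // One multi-row INSERT per batch of 500
    foreach (array_chunk($rows, 500) as $batch) {
        DB::table('phones')->insert($batch);
    }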
rap-2-h