
Can I get an idea of how to import 70k+ rows from a CSV file into the database in Laravel (PHP) while avoiding heavy memory usage and long execution time?

Here is part of the code I am using for the import:

    $reader = new \PhpOffice\PhpSpreadsheet\Reader\Csv();
    $reader->setInputEncoding('UTF-8');

    // Loads the whole spreadsheet, then converts the sheet to an array in memory
    $spreadsheet = $reader->load($localpath);
    $spreadsheet = $spreadsheet->getActiveSheet()->toArray();
    $collection  = LazyCollection::make($spreadsheet);

    // The first row is treated as the header
    if ($key == null) {
        $key = array_map('trim', $collection->first());
        $key_count = count($key);
    }

    $collection = $collection->except(0);

    // Every remaining row is combined with the header and kept in $array
    foreach ($collection as $row) {
        if (count($row) == $key_count && !containsOnlyNull($row)) {
            $array[] = array_combine($key, $row);
        }
    }
itsrajon
  • Do you have any code written for this? Do you use an ORM to load your file? As a general rule, you reduce memory usage by avoiding storing a lot of variables. Using an ORM will consume a lot of memory (depending on the ORM you use), so the first step is to read the CSV by iterating over a file pointer instead of calling `file_get_contents()`, which pulls the whole file into memory – SmasherHell Apr 27 '20 at 10:22
  • Not Laravel specific, but still probably the best way to do it: https://stackoverflow.com/questions/11448307/importing-csv-data-using-php-mysql – Nigel Ren Apr 27 '20 at 10:22
  • Hey @SmasherHell, I have added part of my code to the question. Can you please check? – itsrajon Apr 27 '20 at 10:29
  • @NigelRen I am doing field mapping and the structure of my CSV file varies, so that may not work in my case – itsrajon Apr 27 '20 at 10:38
  • @Pusku in your foreach you should run your business logic and persist the data (line by line or every 100 lines), because as I see it you are using a `LazyCollection` for nothing, since you still store the entire CSV in your `$array` – SmasherHell Apr 27 '20 at 12:07
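
A minimal sketch of the approach suggested in the comments above (streaming the CSV through a file pointer and persisting every 100 rows) could look like the following; the `imports` table name is a placeholder, and `$localpath` is taken from the question:

    use Illuminate\Support\Facades\DB;

    // Sketch only: read the file through a pointer instead of loading it
    // all at once, and flush a bulk insert every 100 rows.
    $handle = fopen($localpath, 'r');
    $header = array_map('trim', fgetcsv($handle));
    $batch  = [];

    while (($row = fgetcsv($handle)) !== false) {
        if (count($row) === count($header)) {
            $batch[] = array_combine($header, $row);
        }
        if (count($batch) >= 100) {
            DB::table('imports')->insert($batch); // placeholder table name
            $batch = [];
        }
    }

    if ($batch !== []) {
        DB::table('imports')->insert($batch); // insert any remaining rows
    }
    fclose($handle);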

2 Answers


You can use laravel-excel with chunk reading for that: https://docs.laravel-excel.com/3.1/imports/chunk-reading.html

From the documentation: "To mitigate this increase in memory usage, you can use the `WithChunkReading` concern. This will read the spreadsheet in chunks and keep the memory usage under control."

This is an example of code you could write:

namespace App\Imports;

use App\User;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class UsersImport implements ToModel, WithChunkReading
{
    // Maps one CSV row to an Eloquent model
    public function model(array $row)
    {
        return new User([
            'name' => $row[0],
        ]);
    }

    // Number of rows to read per chunk
    public function chunkSize(): int
    {
        return 1000;
    }
}
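
For reference, a minimal sketch of how such an import class is usually run (the `users.csv` file name is just an assumption):

use App\Imports\UsersImport;
use Maatwebsite\Excel\Facades\Excel;

// With WithChunkReading in place, the file is read in chunks of 1000 rows
// instead of being loaded into memory in one go.
Excel::import(new UsersImport, 'users.csv');

Combining this with the `WithBatchInserts` concern also groups the generated insert queries into batches, which reduces the number of database round trips.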
Christophe Hubert

I was able to import this file with minimal memory and time usage using LazyCollection. Here is a sample of the code showing how I did it:

    LazyCollection::make(function () use (&$csv_data, $key_count) {
        // Yield only rows that match the header length and are not empty
        foreach ($csv_data as $line) {
            if (count($line) == $key_count && !containsOnlyNull($line)) {
                yield $line;
            }
        }
    })
    ->chunk(1000)
    ->each(function ($lines) use (&$import_info, $user_id, $country, $csv_fields, $key, $total, $products, $allProducts, $sync_id, $import_id) {
        // ... persist/process this chunk of 1000 lines here ...
    });
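
The body of the `each` callback is omitted above; as a minimal sketch, and assuming a placeholder `imports` table, the per-chunk persistence could be a single bulk insert:

    use Illuminate\Support\Facades\DB;

    // Sketch only: turn each chunk of validated lines into associative rows
    // and persist them with one insert per chunk.
    $persistChunk = function ($lines) use ($key) {
        $rows = [];
        foreach ($lines as $line) {
            $rows[] = array_combine($key, $line); // map header names to values
        }
        DB::table('imports')->insert($rows);      // placeholder table name
    };

Passing such a closure to `->each($persistChunk)` keeps only one chunk of rows in memory at a time.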
itsrajon