
I am using Laravel and I need to upload bulk data (~1 million rows, in an .xlsx file) to MySQL.

Currently I am uploading close to 200k rows to MySQL with the phpoffice/phpspreadsheet package.

Here is my code to upload the data:

if($extension == 'xlsx')
{
    info('Entered');
    try{
        $spreadsheet = IOFactory::load($tempPath);                    
        $sheet        = $spreadsheet->getActiveSheet();
        $row_limit    = $sheet->getHighestDataRow();
        $column_limit = $sheet->getHighestDataColumn();
        $row_range    = range( 2, $row_limit );
        $column_range = range( 'AH', $column_limit );
        $startcount = 2;
        $data = array();
        $counter = 0;
        foreach ( $row_range as $row ) {
            $counter++;
            if($counter%5 == 0)
            {
                info('Next 5');
            }
            $data = [
                'id' =>$sheet->getCell( 'A' . $row )->getValue(),
                'ACNo' => $sheet->getCell( 'B' . $row )->getValue(),
                'PartNo' => $sheet->getCell( 'C' . $row )->getValue(),
                'SectionNo' => $sheet->getCell( 'D' . $row )->getValue(),
                'SlNo' => $sheet->getCell( 'E' . $row )->getValue(),
                'HouseNo' =>$sheet->getCell( 'F' . $row )->getValue(),
                'VHouseNo' =>$sheet->getCell( 'G' . $row )->getValue(),
                'EName' =>$sheet->getCell( 'H' . $row )->getValue(),
                'VEName' =>$sheet->getCell( 'I' . $row )->getValue(),
                'sex' =>$sheet->getCell( 'J' . $row )->getValue(),
                'RName' =>$sheet->getCell( 'K' . $row )->getValue(),
                'VRName' =>$sheet->getCell( 'L' . $row )->getValue(),
                'RType' =>$sheet->getCell( 'M' . $row )->getValue(),
                'Age' =>$sheet->getCell( 'N' . $row )->getValue(),
                'IDCardNo' =>$sheet->getCell( 'O' . $row )->getValue(),
                'statustype' =>$sheet->getCell( 'P' . $row )->getValue(),
                'Addition' =>$sheet->getCell( 'Q' . $row )->getValue(),
                'ContactNo' =>$sheet->getCell( 'R' . $row )->getValue(),
                'Address' =>$sheet->getCell( 'S' . $row )->getValue(),
                'VAddress' =>$sheet->getCell( 'T' . $row )->getValue(),
                'PSName' =>$sheet->getCell( 'U' . $row )->getValue(),
                'VPSName' =>$sheet->getCell( 'V' . $row )->getValue(),
                'whatsappNumber' =>$sheet->getCell( 'W' . $row )->getValue(),
                'caste' =>$sheet->getCell( 'X' . $row )->getValue(),
                'casteCategory' =>$sheet->getCell( 'Y' . $row )->getValue(),
                'education' =>$sheet->getCell( 'Z' . $row )->getValue(),
                'currentCity' =>$sheet->getCell( 'AA' . $row )->getValue(),
                'currentDistrict' =>$sheet->getCell( 'AB' . $row )->getValue(),
                'currentState' =>$sheet->getCell( 'AC' . $row )->getValue(),
                'livelihood' =>$sheet->getCell( 'AD' . $row )->getValue(),
                'voted' =>$sheet->getCell( 'AE' . $row )->getValue(),
                'support' =>$sheet->getCell( 'AF' . $row )->getValue(),
                'created_at' =>$sheet->getCell( 'AG' . $row )->getValue(),
                'updated_at' =>$sheet->getCell( 'AH' . $row )->getValue(),

            ];
            $startcount++;
            Voter::create($data);
        }                   
    } 
    catch (\Exception $e)
    {
        echo $e->getMessage();
        exit();
    }
}

Here, $extension is the extension of the file, which should be .xlsx.

I need to optimize this flow so I can upload nearly 1 million rows using this method.
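One change that keeps the same PhpSpreadsheet flow but cuts database round trips: buffer rows in an array and insert them in batches, instead of calling Voter::create() once per row (each create() issues its own INSERT plus Eloquent model overhead). A hedged sketch under the assumption that the Voter columns are mass-assignable; the batch size of 1000 is an assumption to tune against max_allowed_packet:

```php
use PhpOffice\PhpSpreadsheet\IOFactory;

$spreadsheet = IOFactory::load($tempPath);
$sheet = $spreadsheet->getActiveSheet();

$rows = [];
foreach (range(2, $sheet->getHighestDataRow()) as $row) {
    $rows[] = [
        'id'   => $sheet->getCell('A' . $row)->getValue(),
        'ACNo' => $sheet->getCell('B' . $row)->getValue(),
        // ...map the remaining columns exactly as in the code above...
    ];

    if (count($rows) === 1000) {
        Voter::insert($rows);   // one multi-row INSERT per 1000 rows
        $rows = [];
    }
}

if ($rows !== []) {
    Voter::insert($rows);       // flush the final partial batch
}
```

Note that Voter::insert() bypasses Eloquent events and does not fill created_at/updated_at automatically, which is fine here since the sheet already carries those columns. This only speeds up the insert side; the time spent in IOFactory::load() is unaffected.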

Yash Bohra
  • I expect a lot of the time is taken up just opening a spreadsheet that large, that you will not be able to optimise. And the rest grabbing cell data from the ss. I would consider looking at exporting the xlsx to a csv and then using [LOAD DATA](https://www.mysqltutorial.org/import-csv-file-mysql-table/) feature of MySQL – RiggsFolly Sep 13 '22 at 13:40
    Which takes longer? Extracting the data from Excel? Or inserting that data into MySQL? – Rick James Sep 14 '22 at 04:54
  • It takes a lot of time while loading the file: almost 16m32s to load and upload a file containing 247k rows with around 40 columns – Yash Bohra Sep 15 '22 at 13:18
  • @RiggsFolly I first tried this by converting to .csv, but the file will definitely contain Hindi (Indian) text, and that was coming through as question marks (?). – Yash Bohra Sep 15 '22 at 13:22
  • That was not part of your question. I cannot have known you tried that, you didn't put that in your question. Then you probably need to read [UTF-8 All the way through](https://stackoverflow.com/questions/279170/utf-8-all-the-way-through). Please remember we are not clairvoyant and we are not looking over your shoulder; if you don't put relevant facts in the question, they don't exist – RiggsFolly Sep 15 '22 at 13:27
  • I didn't mention it because I don't want to convert the file to CSV. Per the requirement, I need to upload from xlsx, not CSV — not even converting xlsx to CSV in the backend. So I just asked about the code I provided – Yash Bohra Sep 15 '22 at 13:50
  • All I need is to optimize the code. It takes a long time loading the file, but the uploading of the data is fine: it uploads 250k rows with 40 columns in 8 minutes – Yash Bohra Sep 15 '22 at 13:53
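For completeness, the LOAD DATA route discussed in the comments can preserve Hindi text if the CSV is exported as UTF-8 and the statement declares the character set explicitly — the question marks usually mean an encoding was lost somewhere in the chain. A hedged sketch (file path and table name are placeholders; LOAD DATA LOCAL INFILE must be enabled on both the MySQL server and the client connection):

```php
use Illuminate\Support\Facades\DB;

// Assumes the xlsx has first been exported to a UTF-8 encoded CSV.
DB::connection()->getPdo()->exec(
    "LOAD DATA LOCAL INFILE '/tmp/voters.csv'
     INTO TABLE voters
     CHARACTER SET utf8mb4
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'
     IGNORE 1 LINES"
);
```

The CHARACTER SET utf8mb4 clause tells MySQL how to interpret the file's bytes; the table and connection should use utf8mb4 as well for the Hindi text to survive end to end.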

1 Answer


You can use Laravel Excel, which provides chunk reading for very large files. I have used it myself to import around 4.5 million records and it works like a charm! Find the documentation here: https://docs.laravel-excel.com/3.1/imports/chunk-reading.html
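A minimal import class using that chunk-reading feature might look like the sketch below. The class name, heading-row keys, and column mapping are assumptions based on the question's schema; combining WithChunkReading (limits memory while reading) with WithBatchInserts (multi-row INSERTs) is what makes million-row files practical:

```php
namespace App\Imports;

use App\Models\Voter;
use Maatwebsite\Excel\Concerns\ToModel;
use Maatwebsite\Excel\Concerns\WithBatchInserts;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class VotersImport implements ToModel, WithChunkReading, WithBatchInserts, WithHeadingRow
{
    public function model(array $row)
    {
        // With WithHeadingRow, keys are the slugged column headers
        // from row 1 of the sheet (assumed names shown here).
        return new Voter([
            'ACNo'  => $row['acno'],
            'EName' => $row['ename'],
            // ...map the remaining columns...
        ]);
    }

    public function chunkSize(): int
    {
        return 1000;   // rows read into memory per chunk
    }

    public function batchSize(): int
    {
        return 1000;   // rows per INSERT statement
    }
}
```

It would then be invoked from a controller with something like `Excel::import(new VotersImport, $tempPath);`. For files this large, the docs also suggest queuing the import so the HTTP request does not time out.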

Kevin