
I am new to PHP. I am trying to load a large 14MB .csv file into a MySQL table, but it is not fully uploaded into the DB, probably because of the file size (~400,000 rows). The request fails with an error: the page took too long to respond.

Is there any faster way to do it?

My DB is on Amazon RDS, PHP on EC2.

My current code is:

<?php
header('Access-Control-Allow-Origin: *');   

require "../config.php";

//$user_id = $_REQUEST['user_id'];
// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
} 

// path where your CSV file is located
define('CSV_PATH','./');
$csv_file = CSV_PATH . "data_unique.csv";

if (($handle = fopen($csv_file, "r")) !== FALSE) {
   fgetcsv($handle);   
   while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $num = count($data);
        for ($c=0; $c < $num; $c++) {
          $col[$c] = $data[$c];
        }
        $col1 = $col[0];
        $col2 = $col[1];
        $col3 = $col[2];
        $col4 = $col[3];
        $col5 = $col[4];
        $col6 = $col[5];
        // SQL Query to insert data into DataBase
        $query = "INSERT INTO uniqueid_master(autoid,package_id,unique_id,user_id,issued,book_code) VALUES('".$col1."','".$col2."','".$col3."','".$col4."','".$col5."','".$col6."')";
        $result = $conn->query($query);
    }

    fclose($handle);
}


echo "File data successfully imported to database!!";
$conn->close();
?>
raju
  • Please try `ini_set('max_execution_time', -1); ini_set('memory_limit', -1);` — otherwise you can set these in the php.ini file. – Arshad Shaikh Jun 01 '17 at 05:03
  • Also look into using prepared statements, and into inserting multiple rows per INSERT (https://stackoverflow.com/questions/6889065/inserting-multiple-rows-in-mysql) – Nigel Ren Jun 01 '17 at 05:53
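
Following the comments above, a minimal sketch of the question's loop rewritten with a prepared statement inside a single transaction (table and column names taken from the question; `$conn` and `$csv_file` assumed to be set up as in the original code):

```php
<?php
// Assumes $conn (mysqli) and $csv_file are defined as in the question.
$stmt = $conn->prepare(
    "INSERT INTO uniqueid_master
        (autoid, package_id, unique_id, user_id, issued, book_code)
     VALUES (?, ?, ?, ?, ?, ?)"
);

$conn->begin_transaction(); // one commit instead of ~400k autocommits

if (($handle = fopen($csv_file, "r")) !== FALSE) {
    fgetcsv($handle); // skip the header row
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        $stmt->bind_param("ssssss", $data[0], $data[1], $data[2],
                                    $data[3], $data[4], $data[5]);
        $stmt->execute();
    }
    fclose($handle);
}

$conn->commit();
$stmt->close();
```

Wrapping the inserts in one transaction avoids a disk flush per row, which is usually the dominant cost when inserting row-by-row over the network to RDS.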

2 Answers


I think you should try the MySQL LOAD DATA statement. This will be very fast, since you don't have to read everything into PHP.

mysqli_query($dblink, '
    LOAD DATA LOCAL INFILE "'.$file.'"
        INTO TABLE transactions
        FIELDS TERMINATED by ","
        OPTIONALLY ENCLOSED BY "\'"
        LINES TERMINATED BY "\n"
');

It could be written like this too:

$sql = "LOAD DATA LOCAL INFILE '/path/to/file.csv' 
    REPLACE INTO TABLE table_name FIELDS TERMINATED BY ',' 
    ENCLOSED BY '\"' LINES TERMINATED BY '\r\n' IGNORE 1 LINES";
$result = $mysqli->query($sql);
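
Note that `LOAD DATA LOCAL INFILE` is often disabled by default on the client side. A sketch of enabling it before running the query (connection variables assumed to come from the question's `config.php`; on RDS, the server-side `local_infile` parameter may also need enabling via a parameter group):

```php
<?php
// Assumes $servername, $username, $password, $dbname from config.php.
$mysqli = mysqli_init();
// Allow this client to send a local file to the server.
$mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true);
$mysqli->real_connect($servername, $username, $password, $dbname);

$sql = "LOAD DATA LOCAL INFILE '/path/to/file.csv'
        REPLACE INTO TABLE table_name
        FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
        LINES TERMINATED BY '\n' IGNORE 1 LINES";
$result = $mysqli->query($sql);
```

The `mysqli.allow_local_infile` php.ini setting must also permit this, depending on your PHP version.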

OR

For an alternate method, refer to this question too.

NID

This will lift some of PHP's runtime limits:

ini_set('memory_limit','-1');
ini_set('max_execution_time', 0);

This might solve your problem. But it could still be possible that you reach a memory error; in that case, divide the CSV file into multiple chunks and handle them one after another. This is a good way to handle it.
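
A rough sketch of chunked processing against the table from the question: rows are buffered and flushed as one multi-row INSERT every 500 rows, so memory stays bounded and the number of round trips to RDS drops sharply (`$conn` is assumed to be an open mysqli connection, as in the question):

```php
<?php
// Assumes $conn is an open mysqli connection, as in the question.
$batch = [];
$batchSize = 500;

$flush = function () use ($conn, &$batch) {
    if (!$batch) return;
    $conn->query(
        "INSERT INTO uniqueid_master
            (autoid, package_id, unique_id, user_id, issued, book_code)
         VALUES " . implode(",", $batch)
    );
    $batch = [];
};

if (($handle = fopen("data_unique.csv", "r")) !== FALSE) {
    fgetcsv($handle); // skip the header row
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // Escape values before building the multi-row VALUES list.
        $vals = array_map([$conn, 'real_escape_string'], array_slice($data, 0, 6));
        $batch[] = "('" . implode("','", $vals) . "')";
        if (count($batch) >= $batchSize) $flush();
    }
    $flush(); // flush any remaining rows
    fclose($handle);
}
```

The batch size of 500 is an arbitrary choice; tune it against your `max_allowed_packet` setting.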

How to extract data from csv file in PHP

Umashankar Das