I am loading a CSV file through PHP and inserting the rows into a temporary table. This is my loading PHP script and MySQL query:

    $i = 0;
    while (($data = fgetcsv($source, 1000, ",")) !== FALSE)
    {
        if ($i != 0) {  // skip the header row
            $column1 = $data[0];
            $column2 = $data[1];
            $column3 = $data[2];
            $column4 = $data[3];
            $column5 = $data[4];
            $column6 = $data[5];
            // one INSERT per CSV row
            $query = "INSERT INTO temp_table(column1,column2,column3,column4,column5,column6,load_datetime)
    VALUES (nullif('$column1',''), nullif('$column2',''), nullif('$column3',''), nullif('$column4',''), nullif('$column5',''), nullif('$column6',''), now())";
            mysql_query($query) or die(mysql_error());
        }
        $i++;
    }

My CSV file has 25,000 records. Here are the header and a sample row from it:

        column1,column2,column3,column4,column5,column6
        ,,,13:57:18,,23:00:19

The problem I face is that loading the CSV takes more than 10 minutes, and in the meantime the page goes blank. My PHP settings: upload_max_filesize: 100M, post_max_size: 100M, max_execution_time: 1000, max_input_time: 1000.

This is my table definition:

   CREATE TABLE temp_table (
       id int(11) NOT NULL,
       column1 time DEFAULT NULL,
       column2 time DEFAULT NULL,
       column3 time DEFAULT NULL,
       column4 time DEFAULT NULL,
       column5 time DEFAULT NULL,
       column6 time DEFAULT NULL,
       load_datetime datetime DEFAULT NULL
   ) ENGINE=InnoDB DEFAULT CHARSET=utf8;

When I check temp_table, only around 8,000 records have been inserted so far. Please help me with an approach to insert faster.

davidb

2 Answers

I would recommend that you stop using the mysql extension (mysql_*) and start using mysqli or PDO.

Try to group the data of several rows into one query.

Example:

    $i  = 0;   // total rows read (used to skip the header)
    $i2 = 0;   // rows accumulated in the current batch
    $sql = "INSERT INTO temp_table(column1,column2,column3,column4,column5,column6,load_datetime) VALUES ";
    $values = [];
    while (($data = fgetcsv($source, 1000, ",")) !== FALSE) {
        if (++$i == 1) {
            continue; // skip the header row
        }
        // Turn empty strings into SQL NULL, quote everything else
        $data = array_map(function($item) {
            return $item == '' ? 'NULL' : "'$item'";
        }, $data);
        $values[] = "(" . implode(", ", $data) . ", now())";
        // Send one multi-row INSERT for every 500 rows
        if (++$i2 == 500) {
            mysql_query($sql . implode(', ', $values)) or die(mysql_error());
            $values = [];
            $i2 = 0;
        }
    }
    // Insert whatever is left in the last, partial batch
    if (!empty($values)) {
        mysql_query($sql . implode(', ', $values)) or die(mysql_error());
    }
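
Since this answer recommends mysqli or PDO but the example above still goes through mysql_query(), here is a rough sketch of the same batching idea with mysqli prepared statements. It assumes an open mysqli connection in $mysqli and the same $source file handle as in the question; the $flush() helper and the 500-row batch size are illustrative choices, not something from the original post.

    // Sketch only: $mysqli is assumed to be an open mysqli connection,
    // $source the CSV file handle from the question.
    $batchSize = 500;   // same batch size as the example above
    $batch = [];

    // Send one multi-row INSERT for a batch of CSV rows using a prepared statement.
    $flush = function (array $rows) use ($mysqli) {
        if (empty($rows)) {
            return;
        }
        // One "(?, ?, ?, ?, ?, ?, now())" group per row.
        $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?, ?, ?, ?, ?, now())'));
        $stmt = $mysqli->prepare(
            "INSERT INTO temp_table (column1, column2, column3, column4, column5, column6, load_datetime)
             VALUES $placeholders"
        );
        // Flatten the rows; empty strings become real NULLs through binding.
        $params = [];
        foreach ($rows as $row) {
            foreach ($row as $value) {
                $params[] = ($value === '') ? null : $value;
            }
        }
        // Spread into bind_param (PHP 5.6+).
        $stmt->bind_param(str_repeat('s', count($params)), ...$params);
        $stmt->execute();
        $stmt->close();
    };

    $line = 0;
    while (($data = fgetcsv($source, 1000, ",")) !== FALSE) {
        if (++$line == 1) {
            continue;                        // skip the header row
        }
        $batch[] = array_slice($data, 0, 6); // keep the six data columns
        if (count($batch) >= $batchSize) {
            $flush($batch);
            $batch = [];
        }
    }
    $flush($batch);                          // last, partial batch

With a prepared statement the values no longer need to be quoted by hand, and binding a PHP null sends a real SQL NULL, which covers the empty-string handling from the question.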
Neodan

I think the time is being spent in the PHP script. Try the code below and let me know how much it improves; you can also group 50 to 100 records into a single query at a time.

    $i = 0;
    while (($data = fgetcsv($source, 1000, ",")) !== FALSE)
    {
        if ($i != 0)
        {
            $query = "INSERT INTO temp_table(column1,column2,column3,column4,column5,column6,load_datetime)
    VALUES (nullif('$data[0]',''), nullif('$data[1]',''), nullif('$data[2]',''), nullif('$data[3]',''), nullif('$data[4]',''), nullif('$data[5]',''), now())";
            mysql_query($query) or die(mysql_error());
        }
        $i++;
    }
Bhushan
  • Thank you, I have tried this; it is taking the same time as before. – davidb Oct 03 '17 at 12:30
  • Can I tell how much time the query takes when it is executed directly in the DB, exactly as it is written in the PHP code? – Bhushan Oct 03 '17 at 12:35
  • Also, in the CREATE TABLE code, id should be an auto-increment field or it should have a default value. – Bhushan Oct 03 '17 at 12:38
  • If the query execution itself is not what takes the time, you can go through the article below to find out where in the PHP code the time is spent (a rough timing sketch is shown after these comments): https://stackoverflow.com/questions/6245971/accurate-way-to-measure-execution-times-of-php-scripts – Bhushan Oct 03 '17 at 12:45
  • The id field in the table is the primary key and is set as auto-increment. – davidb Oct 03 '17 at 12:54
  • @Jack The communication between PHP and MySQL is the slowest part in this situation, so the OP just needs to reduce the number of requests. – Neodan Oct 03 '17 at 13:15
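
As an illustration of the measuring suggestion in Bhushan's comment, here is a minimal timing sketch of my own, reusing the question's loop and its existing mysql_* calls only so the comparison is like-for-like. It accumulates the time spent reading the CSV and the time spent running the queries separately:

    $csvTime   = 0.0;  // total seconds spent in fgetcsv()
    $queryTime = 0.0;  // total seconds spent in mysql_query()
    $i = 0;

    while (true) {
        $t = microtime(true);
        $data = fgetcsv($source, 1000, ",");
        $csvTime += microtime(true) - $t;
        if ($data === FALSE) {
            break;
        }
        if ($i++ == 0) {
            continue; // skip the header row
        }
        $query = "INSERT INTO temp_table(column1,column2,column3,column4,column5,column6,load_datetime)
    VALUES (nullif('$data[0]',''), nullif('$data[1]',''), nullif('$data[2]',''), nullif('$data[3]',''), nullif('$data[4]',''), nullif('$data[5]',''), now())";
        $t = microtime(true);
        mysql_query($query) or die(mysql_error());
        $queryTime += microtime(true) - $t;
    }

    echo "fgetcsv total: {$csvTime}s, mysql_query total: {$queryTime}s";

If the mysql_query total dominates, reducing the number of round trips (the batching shown in the other answer) is the place to start; if fgetcsv dominates, the bottleneck is in reading the file instead.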