
I am new to working with MySQL and am looking for some guidance with this problem.
I am entering lots of data into a MySQL table.
I have set it up so the VALUES are built in a loop and the INSERT part only appears once:
INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

I have read this SO post: How to do a batch insert in MySQL

I am trying to insert up to 160,000 rows of data. My PHP script seems to get stuck after a short time, and only 6,499 rows make it into the table.

I think I am hitting the max_allowed_packet limit.

I am using hosting with iPage and I am on a shared server. I've asked iPage about max_allowed_packet, but I'm told it can only be increased on a Virtual Private Server (VPS). This project is not worth paying for a private server at this time.
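For reference, even without VPS access the current limit can at least be read, and a query's size can be checked against it before sending. A small sketch (the helper function is hypothetical, the credentials in the commented-out lines are placeholders, and 1048576 bytes is MySQL's historical default):

```php
<?php

// Sketch: check whether an assembled query would fit in max_allowed_packet.
// The packet limit is measured in bytes, and strlen() counts bytes, so a
// plain length comparison is a reasonable first approximation.
function fitsInPacket(string $query, int $maxAllowedPacket): bool
{
    return strlen($query) < $maxAllowedPacket;
}

// On a live connection the limit itself could be read like this:
// $row = $db->query("SHOW VARIABLES LIKE 'max_allowed_packet'")->fetch_assoc();
// $limit = (int)$row['Value'];

// With a 1 MB limit (1048576 bytes):
var_dump(fitsInPacket(str_repeat('x', 100), 1048576));     // bool(true)
var_dump(fitsInPacket(str_repeat('x', 2000000), 1048576)); // bool(false)
```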

I'm looking for coding ways to insert all this data whilst on a shared server.
Is submitting 1,000 rows at a time an option?
I'm not sure whether this is a good solution, so I'm grateful for any ideas.

Here is the section of the code where the data is inserted:

// prepare the first part of the query (before the values)
$query = "INSERT INTO `Events_testing2` (
    `eventID`,
    `location`,
    `date`,
    `barcode`,
    `runner`,
    `time`,
    `Run Points`,
    `Volunteer Points`,
    `Gender`,
    `Gender pos`
) VALUES ";

// loop over the runner array, building one "(...)" group per row
for ($x = 0; $x < count($array_runner); $x++) {
    $query_values[] = "(' ','$location','$date','$array_barcode[$x]','$array_runner[$x]','$array_time[$x]','$array_score[$x]','','$array_gender[$x]','$array_gender_pos[$x]')";
}

// put the query together and insert into the DB
$queryInsertUser = mysqli_query($link, $query . implode(',', $query_values));
Jeanclaude
  • You might be better using a prepared statement with the 10 columns and executing the statement in a loop rather than trying to create a huge statement and executing once – Professor Abronsius May 03 '17 at 09:48
  • You can't do it in batches? Split your dataset into chunks and for each chunk insert data. – Andreas May 03 '17 at 10:01
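The chunking suggestion from the comments could be sketched like this. It is untested; $query and $link are assumed to come from the question's code, and $query_values is initialised empty here only so the sketch stands alone (in the real script it would hold the 160,000 "(...)" row strings):

```php
<?php

// Sketch of the batching idea: keep the single "INSERT ... VALUES " prefix
// from the question, but send the rows in batches of 1000 so no statement
// outgrows max_allowed_packet.
$query_values = []; // in the real script this holds the "(...)" row strings

$batchSize = 1000;
foreach (array_chunk($query_values, $batchSize) as $chunk) {
    // one INSERT per chunk of up to 1000 rows
    if (!mysqli_query($link, $query . implode(',', $chunk))) {
        die('Insert failed: ' . mysqli_error($link));
    }
}
```

With 160,000 rows and a batch size of 1,000, this sends 160 statements instead of one enormous one.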

1 Answer


As per the comments, if you were to use a prepared statement you might see a real performance difference when inserting thousands of records. Hopefully the following gives an idea of how you could achieve that, though it is of course untested.

<?php

    $dbhost = 'localhost';
    $dbuser = 'root';
    $dbpwd  = 'xxx';
    $dbname = 'xxx';
    $db = new mysqli( $dbhost, $dbuser, $dbpwd, $dbname );

    $sql = 'insert into `Events_testing2` (
        `eventID`,
        `location`,
        `date`,
        `barcode`,
        `runner`,
        `time`,
        `Run Points`,
        `Volunteer Points`,
        `Gender`,
        `Gender pos`
    ) values (?,?,?,?,?,?,?,?,?,?)';

    $stmt = $db->prepare( $sql );

    if( $stmt ){

        // bind once; the bound variables are read afresh on every execute()
        $stmt->bind_param('isssssiiss',$id,$location,$date,$barcode,$runner,$time,$runpoints,$volpoints,$gender,$genpos);
        /*
            $location & $date seem to be defined already
        */
        for( $x=0; $x < count( $array_runner ); $x++ ){
            $id        = null;  // inserted as SQL NULL
            $barcode   = $array_barcode[$x];
            $runner    = $array_runner[$x];
            $time      = $array_time[$x];
            $runpoints = $array_score[$x];
            $volpoints = null;
            $gender    = $array_gender[$x];
            $genpos    = $array_gender_pos[$x];

            $stmt->execute();
        }
        $stmt->close();
        $db->close();
    }
?>
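If it is still slow, one further untested tweak is to wrap the loop in a single transaction, so each execute() does not force its own commit. This sketch assumes $db, $stmt and the runner arrays are set up exactly as in the answer above:

```php
<?php

// Sketch (assumes an InnoDB table and the $db/$stmt from the answer above):
// one explicit transaction around the whole loop instead of an implicit
// commit per inserted row.
$db->begin_transaction();
for ($x = 0; $x < count($array_runner); $x++) {
    // ...assign $id, $barcode, $runner, etc. as in the loop above...
    $stmt->execute();
}
$db->commit();
```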
Professor Abronsius