
I have a text file to read which has around 10,000 points, separated like this: x1,y1 x2,y2 x3,y3 . . (10,000 times)

I read them using a loop in PHP, store them in an array, and then run another loop inserting one row at a time into my database. It takes a really long time. Is there any way I can insert the whole array at once?

for ($i = 0; $i < 10000; $i++) {
    $sql = '
        INSERT INTO `firefly`.`FreeFormPoly` 
            (`markedObjectID`, `order`, `x`, `y`) 
        VALUES 
            (' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ', ' . $valuesy[$i] . ')';

    $db->query($sql);
}

4 Answers


Try using a multi-row INSERT statement: generate one INSERT and submit the entire statement.

INSERT INTO tbl_name (a,b,c) VALUES(1,2,3),(4,5,6),(7,8,9);

So:

$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < 10000; $i++) {
    if ($i != 0) $sql .= ',';
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . ')';
}
$db->query($sql);
  • There is a limit to how big an INSERT you can send. You might need an extra loop around this one. Depending on the size of the SQL, I recall having to limit it to 3,000 grouped inserts, but everybody's will vary. – TravisO Jun 28 '13 at 18:29
  • Perhaps there is a limit. In that case, "batch" them up into as many as you can. No doubt the slowdown is from the SQL parsing so many statements. – beiller Jun 28 '13 at 18:41
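Building on the comments above, here is a sketch of how the batching could look. The helper name `buildBatchedInserts` and the chunk size of 1,000 rows are assumptions, not part of the original answer; the right chunk size depends on your server's `max_allowed_packet` setting.

```php
// Sketch: split the rows into chunks so each INSERT stays under MySQL's
// max_allowed_packet. Chunk size of 1000 is an assumption -- tune it.
function buildBatchedInserts(array $xs, array $ys, $markedObjectsID, $order, $chunkSize = 1000)
{
    $head = 'INSERT INTO `firefly`.`FreeFormPoly` '
          . '(`markedObjectID`, `order`, `x`, `y`) VALUES ';
    $statements = array();
    $rows = array();
    foreach ($xs as $i => $x) {
        $rows[] = '(' . (int)$markedObjectsID . ', ' . (int)$order . ', '
                . (float)$x . ', ' . (float)$ys[$i] . ')';
        if (count($rows) === $chunkSize) {
            $statements[] = $head . implode(',', $rows);
            $rows = array();        // start the next batch
        }
    }
    if ($rows) {                    // flush the final partial batch
        $statements[] = $head . implode(',', $rows);
    }
    return $statements;
}
```

You would then run each batched statement in turn, e.g. `foreach (buildBatchedInserts($valuesx, $valuesy, $markedObjectsID, $order) as $sql) { $db->query($sql); }` — 10 queries instead of 10,000.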

I would do something like this:

$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < $length; $i++) {
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . '),';
}
$sql = substr($sql, 0, -1);
$db->query($sql);

Explanation:

The syntax to insert multiple records is

INSERT INTO TABLE_NAME VALUES(VAL1, VAL2, ....), (...), (...);

In the SQL you are concatenating (val1,val2,val3), every time you execute the loop, hence you will get an extra comma in the last position, and substr() trims it off.
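An alternative sketch that avoids the trailing comma entirely: collect the row tuples in an array and implode() them. The sample data below is hypothetical, standing in for the values parsed from the file.

```php
// Alternative: build each row tuple in an array, then implode() --
// no trailing comma to trim. Sample data stands in for the parsed file.
$markedObjectsID = 7;
$order = 1;
$valuesx = array(10, 20, 30);
$valuesy = array(1, 2, 3);

$rows = array();
foreach ($valuesx as $i => $x) {
    $rows[] = '(' . $markedObjectsID . ', ' . $order . ', ' . $x . ', ' . $valuesy[$i] . ')';
}
$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES '
     . implode(', ', $rows);
```

implode() only places the separator between elements, so the statement ends cleanly after the last tuple.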

Preferably, I would do

$sql = 'INSERT INTO `firefly`.`FreeFormPoly` (`markedObjectID`, `order`, `x`, `y`) VALUES ';
for ($i = 0; $i < $length; $i++) {
    $sql .= '(' . $markedObjectsID . ', ' . $order . ', ' . $valuesx[$i] . ',' . $valuesy[$i] . '),';
}
$sql = substr($sql, 0, -1);
$result = mysqli_query($db, $sql)
    or die('Error in querying the database');

You should be able to speed it up considerably by sending BEGIN TRANSACTION before the loop and COMMIT after the loop. Once I had to insert 14,000 data points into SQLite, and it took 20 minutes, but when I put the data in as a transaction it completed in 0.3 seconds.
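A runnable sketch of this approach using PDO; an in-memory SQLite database stands in for the real one so the snippet works as-is (with MySQL only the DSN and credentials would change), and the table layout is borrowed from the question:

```php
// Sketch: wrap all inserts in one transaction. SQLite in-memory is used
// here only so the example is self-contained.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE FreeFormPoly (markedObjectID INT, `order` INT, x REAL, y REAL)');

$valuesx = range(0, 99);    // stand-in for the 10,000 parsed points
$valuesy = range(100, 199);

$db->beginTransaction();    // one transaction for all rows
$stmt = $db->prepare('INSERT INTO FreeFormPoly VALUES (?, ?, ?, ?)');
foreach ($valuesx as $i => $x) {
    $stmt->execute(array(7, $i, $x, $valuesy[$i]));
}
$db->commit();              // one disk sync instead of one per row
```

The dramatic speedup comes from the database syncing to disk once at COMMIT rather than after every single-row INSERT.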


First off, you can use prepared statements to reduce the network overhead. Assuming PDO:

$stmt = $db->prepare('INSERT INTO mytable (foo, bar, baz) VALUES (:foo, :bar, :baz)');

for ($i = 0; $i < $length; ++$i) {
    $stmt->execute(array(
        ':foo' => $data[$i]['foo'],
        ':bar' => $data[$i]['bar'],
        ':baz' => $data[$i]['baz'],
    ));
}

Second, you could wrap the whole code inside $db->beginTransaction() and $db->commit().
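Putting the two together, a self-contained sketch (an in-memory SQLite database and sample `$data` stand in for the real setup; the table and column names are the placeholders from the answer above):

```php
// Sketch: prepared statement + transaction, with rollback on failure.
// SQLite in-memory and the sample $data are assumptions for demonstration.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE mytable (foo, bar, baz)');

$data = array(
    array('foo' => 1, 'bar' => 2, 'baz' => 3),
    array('foo' => 4, 'bar' => 5, 'baz' => 6),
);

$db->beginTransaction();
try {
    $stmt = $db->prepare('INSERT INTO mytable (foo, bar, baz) VALUES (:foo, :bar, :baz)');
    foreach ($data as $row) {
        $stmt->execute(array(
            ':foo' => $row['foo'],
            ':bar' => $row['bar'],
            ':baz' => $row['baz'],
        ));
    }
    $db->commit();
} catch (Exception $e) {
    $db->rollBack();   // leave the table untouched if any insert fails
    throw $e;
}
```

The try/catch ensures the transaction is rolled back on error, so you never end up with a partially inserted point set.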
