
I have a large MySQL result set whose rows I insert into other tables as I loop through it.

//Code
$result = mysqli_query($daffff, "select * from 0_suppliers");
while ($tran_pucha1we = mysqli_fetch_array($result)) {
   //Several Inserts here
}

The result set has around 3000 suppliers, and inside the loop I insert into 4 other tables, which makes the process time out.

What ways are available to handle such large result sets without timing out?

Sam

4 Answers


You should use an INSERT ... SELECT statement, which runs entirely inside MySQL and is much faster. Example:

INSERT INTO table (column1, column2)
SELECT column1, 
column2 
FROM table2

where table2 is your table 0_suppliers.
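Assuming hypothetical target table and column names (supplier_contacts, contact_name, and so on — substitute your real schema), one such statement per target table replaces the whole PHP loop:

```php
<?php
// One INSERT ... SELECT per target table copies all ~3000 rows
// inside MySQL in a single round trip, instead of 3000 per-row
// inserts issued from PHP. Table/column names are placeholders.
$sql = "INSERT INTO supplier_contacts (supplier_id, name, phone)
        SELECT id, contact_name, phone
        FROM 0_suppliers";

if (!mysqli_query($daffff, $sql)) {
    die(mysqli_error($daffff));
}
```

Repeat with a different column list for each of your 4 target tables; no fetch loop is needed at all.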

b4rt3kk
  • Use transactions explicitly instead of relying on autocommit, to reduce overhead. You can include many statements in one transaction, but 12,000 (your 3000 rows × 4 inserts) is probably the upper limit of what I'd be confident running in a single transaction.

    mysqli_begin_transaction($daffff);
    $result= mysqli_query($daffff, "select * from 0_suppliers");
    while ($tran_pucha1we = mysqli_fetch_array($result)) {
       //Several Inserts here
    }
    mysqli_commit($daffff);
    
  • Prepare the INSERT statements before the loop, and just execute them repeatedly within the loop.

    $result= mysqli_query($daffff, "select * from 0_suppliers");
    
    $insert1 = mysqli_prepare($daffff, "INSERT INTO MyTable1 (col1, col2, col3) VALUES (?, ?, ?)");
    $insert2 = mysqli_prepare($daffff, "INSERT INTO MyTable2 (col1, col2, col3) VALUES (?, ?, ?)");
    
    while ($tran_pucha1we = mysqli_fetch_array($result)) {
    
       mysqli_stmt_bind_param($insert1, "sss", $tran_pucha1we[0], $tran_pucha1we[1], $tran_pucha1we[2]);
       mysqli_stmt_execute($insert1);
    
       mysqli_stmt_bind_param($insert2, "sss", $tran_pucha1we[3], $tran_pucha1we[4], $tran_pucha1we[5]);
       mysqli_stmt_execute($insert2);
    
    }
    
  • Collect a bunch of rows of your SELECT query, and use multi-row insert syntax.

    INSERT INTO MyTable (col1, col2, col3) VALUES
      (?, ?, ?), (?, ?, ?), (?, ?, ?), ...
    

You can combine all of the above techniques.

You may also like to know that you can change the PHP timeout. See this question: How to increase the execution timeout in php?
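For instance, either of these raises the limit for the current script (values are in seconds; 0 disables the limit entirely, which is risky for a web request):

```php
<?php
// Allow this script to run for up to 5 minutes.
set_time_limit(300);

// Equivalent via the ini setting:
ini_set('max_execution_time', '300');
```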

Bill Karwin

Here is pagination using your code. It is timeout-proof, but because it disables the time limit entirely, a bug could leave it running forever. Don't use it as-is; just take the idea.

$howMany = mysqli_query($daffff, "select count(*) from 0_suppliers")->fetch_array()[0];
$current = 0;
// this is how many records to process per loop; if it still times out, make it smaller.
$numToProcess = 100;

// this is what you really came for... but be careful: 0 means no time limit at all.
set_time_limit(0);

while ($current < $howMany) {
    // LIMIT takes the offset first, then the row count
    $result = mysqli_query($daffff, "select * from 0_suppliers limit $current, $numToProcess");
    while ($tran_pucha1we = mysqli_fetch_array($result)) {
       //Several Inserts here

       $current++;
    }
}

As others have suggested, it is indeed a better idea to refactor into a multi-row insert if possible; it is MUCH faster.

NappingRabbit

I recommend using LIMIT and OFFSET to process the result set in pages.

The MySQL documentation for SELECT covers the LIMIT and OFFSET syntax, with examples.
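For example, to fetch the third page of 100 suppliers (the ORDER BY and the `id` column are assumptions — any unique ordering works, but without one the pages are not stable between queries):

```sql
SELECT * FROM 0_suppliers
ORDER BY id            -- assumes a unique `id` column
LIMIT 100 OFFSET 200;  -- skip 200 rows, return rows 201-300
```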

Mxlvin