
I have to insert around 75,000 rows of data, which are calculated from another table. I have tried the following code:

$start    = new DateTime('2018-09-01');
$end      = new DateTime('2018-12-31');
$interval = DateInterval::createFromDateString('1 month');
$period   = new DatePeriod($start, $interval, $end);

// Create connection
$conn = new mysqli($servername, $username, $password, $dbname);
// Check connection
if ($conn->connect_error) {
    die("Connection failed: " . $conn->connect_error);
}

foreach ($period as $dt) {
    $inputyear = $dt->format("Y");
    $inputmonth = $dt->format("m");


    $sql = "insert into tbl1() select from ... ";
    // HERE I JOIN 3 tables
    $result = $conn->query($sql);
}
$conn->close();

It's giving me a timeout error. I have tried increasing the WAMP timeout to 300 as well, but it did not work. How can I optimize the above code?

Asiya Fatima
nas
    I would try to export all the data you need to insert as SQL into one file and then piping that into the MySQL client from the command line. That way you won't have to deal with any PHP execution timeouts. – YetiCGN Dec 13 '18 at 10:52
  • @YetiCGN can you give me example? – nas Dec 13 '18 at 10:55
  • 1
    As it seems to be a one-time reporting task and not something that needs to be part of any application, I'd recommend to not use PHP at all for this but using a nice database tool (MySQL Workbench, HeidiSQL, or even phpMyAdmin) to export the results of your queries to SQL files. Then use `mysql database -u user -p < your_dump_file.sql` on the command line. – YetiCGN Dec 13 '18 at 10:59
  • In the best-case scenario I would insert in [batch](https://stackoverflow.com/a/780046/8068675) using a [queue worker](https://github.com/phpredis/phpredis) – Clément Baconnier Dec 13 '18 at 11:00
  • [too late edit] wrong link for the queue worker: https://github.com/javibravo/simpleue – Clément Baconnier Dec 13 '18 at 11:07

2 Answers


You should avoid running queries in a loop. I would recommend using one query that gets all the data at once and parsing it on the PHP side.
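For instance, the per-month loop can usually be collapsed into one `INSERT ... SELECT` covering the whole date range, so MySQL copies the rows server-side in a single round-trip. A minimal sketch; the table and column names (`tbl1`, `src`, `col_a`, `col_b`, `created_at`) are placeholders, since the real schema isn't shown in the question:

```php
<?php
// Build one INSERT ... SELECT over the whole range instead of issuing
// one query per month from PHP. All names below are hypothetical.
function buildRangeInsert(string $start, string $end): string
{
    return "INSERT INTO tbl1 (col_a, col_b, created_at) "
         . "SELECT s.col_a, s.col_b, s.created_at "
         . "FROM src s "
         . "/* JOIN the other two tables here */ "
         . "WHERE s.created_at BETWEEN '$start' AND '$end'";
}

$sql = buildRangeInsert('2018-09-01', '2018-12-31');
// $conn->query($sql);   // one round-trip instead of one per month
```

With this shape there is no result set to ship back to PHP at all; the database does the whole transfer internally.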

I would also recommend checking your indexes. Multiple joins are very heavy when indexes are not used. See https://dev.mysql.com/doc/refman/8.0/en/explain.html
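Concretely, you can prefix the `SELECT` part of the insert with `EXPLAIN` and inspect the plan before running the real query. A hedged sketch, again with placeholder table/column names:

```php
<?php
// Prefix the SELECT with EXPLAIN to see whether each joined table uses
// an index. Table and column names here are hypothetical.
$select = "SELECT s.col_a, s.col_b "
        . "FROM src s "
        . "JOIN other o ON o.id = s.other_id";

$explainSql = "EXPLAIN " . $select;

// With a live connection, a "type" of ALL in the output means a full
// table scan on that join:
// foreach ($conn->query($explainSql) as $row) {
//     printf("%s: type=%s, key=%s\n", $row['table'], $row['type'], $row['key'] ?? 'NULL');
// }
```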

If you need this data accessible at runtime, consider denormalising your table structure.

Alex

The worst solution:

set_time_limit(0);

Best practice: try to delegate this task to a background worker (e.g. using a queue system).
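If the job does stay in PHP (say, in a CLI worker where a long run time is acceptable), grouping rows into multi-row `INSERT` statements cuts the round-trips from 75,000 to under a hundred. A sketch; the table name, columns, and batch size are assumptions, and in real use the values must be escaped or bound as parameters:

```php
<?php
// Group rows into multi-row INSERTs (e.g. 1000 value tuples per
// statement) so 75,000 rows become ~75 queries. Placeholder schema.
function buildBatchedInserts(array $rows, int $batchSize = 1000): array
{
    $statements = [];
    foreach (array_chunk($rows, $batchSize) as $chunk) {
        // Turn each row array into a "(v1, v2)" tuple.
        $values = array_map(
            fn(array $r) => "(" . implode(", ", $r) . ")",
            $chunk
        );
        $statements[] = "INSERT INTO tbl1 (col_a, col_b) VALUES "
                      . implode(", ", $values);
    }
    return $statements;
}

$stmts = buildBatchedInserts([[1, 2], [3, 4], [5, 6]], 2);
// Two statements: the first carries two value tuples, the second one.
```

Each statement can then be handed to the worker's `$conn->query()` in turn, ideally inside a transaction so a failed batch rolls back cleanly.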