
I have 100,000 records in table A, and a cron job needs to insert some of A's data (along with some other data) into table B. I implemented the code below with a pagination rule, but each new page takes longer to process.

Every 10,000 records (the page limit) the page changes, and the processing time keeps increasing. How can I reduce the processing time?

$DB = new myclassi();

$resource_new = "SELECT COUNT(`gi`.`igems_item_id`) as total 
FROM `gems_item` AS `gi` 
WHERE NOT EXISTS (SELECT `gicd`.`id` FROM `gems_item_cron_data` As `gicd` WHERE `gi`.`old_sku` = `gicd`.`gems_old_sku` AND `gi`.`igems_item_id` = `gicd`.`gems_itemid`)
AND `gi`.`diamond_video_url` IS NOT NULL 
AND (`gi`.`diamond_video_url` LIKE '%http%' OR  `gi`.`diamond_video_url` LIKE '%https%')";
$res = $DB->select_assoc($resource_new);
$total_rows = $res[0]['total'];


$no_of_records_per_page = 10000;
$total_pages  =  ceil ($total_rows / $no_of_records_per_page);
$limit = "";
$pageno = 1;
 

for ($i = 0; $i < $total_pages; $i++) {

    // LIMIT's offset is zero-based, so page N starts at (N - 1) * page size;
    // the original "+ 1" calculation skipped one row on every page after the first.
    $offset = ($pageno - 1) * $no_of_records_per_page;
    $limit  = $offset.", ".$no_of_records_per_page;
    
    $sqlr="SELECT `gi`.`igems_item_id`, `gi`.`old_sku`, `gi`.`diamond_video_url` 
    FROM `gems_item` AS `gi` 
    WHERE NOT EXISTS (SELECT `gicd`.`id` FROM `gems_item_cron_data` As `gicd` WHERE `gi`.`old_sku` = `gicd`.`gems_old_sku` AND `gi`.`igems_item_id` = `gicd`.`gems_itemid`)
    AND  `gi`.`diamond_video_url` IS NOT NULL 
    AND (`gi`.`diamond_video_url` LIKE '%http%' OR  `gi`.`diamond_video_url` LIKE '%https%')  LIMIT ". $limit;
    $results = $DB->select_assoc($sqlr);                    
    if (!empty($results)) {
        foreach ($results as $row) {
            $url = urldecode($row['diamond_video_url']);
            $k = $row['igems_item_id'];
            $video_url = 'xxxfdafas';
            $old_sku = $row['old_sku'];
            $diamond_video_url = $row['diamond_video_url'];
            $diamond_final_video_url = urlencode($video_url);
            $is_url_found = '';
            $is_processed = 0;
            $created_at = date('Y-m-d H:i:s');

            $sql = "INSERT INTO gems_item_cron_data (`gems_itemid`,`gems_old_sku`, `diamond_final_video_url`,`is_processed`,`created_at`) 
            VALUES ($k, '$old_sku','$diamond_final_video_url', '$is_processed', '$created_at')";
            $DB->insert($sql);
        }
    }
    $pageno++;
}
$DB->close();

Please let me know how to make the insert process faster.

Abid Hussain
    By "corn", do you mean "cron"? – Chris Haas Jun 18 '21 at 21:11
  • [MySQL supports a single statement with multiple rows to INSERT](https://stackoverflow.com/a/12502067/231316). [Transactions](https://stackoverflow.com/a/15149622/231316) are also an option. – Chris Haas Jun 18 '21 at 21:14

1 Answer


**Simplify the test**

This

      AND  `gi`.`diamond_video_url` IS NOT NULL
      AND  (`gi`.`diamond_video_url` LIKE '%http%'
              OR  `gi`.`diamond_video_url` LIKE '%https%'
           )";

can be simplified, and sped up significantly, to this

     AND  `gi`.`diamond_video_url` LIKE 'http%'

especially if the URL necessarily starts with "http". Note that `'%http%'` already matches everything `'%https%'` matches, so the `OR` was redundant; and dropping the leading wildcard lets MySQL use an index on that column, if one exists.

**Eliminate COUNT**

I suggest there is no need to first count the rows. Simply do the SELECT to fetch the rows, then notice whether you got any rows. That, alone, will (probably) more than double the speed of the program.
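A minimal sketch of that loop, reusing the `myclassi` helper and column names from the question. Note that because each processed row is inserted into `gems_item_cron_data`, it falls out of the `NOT EXISTS` filter on the next pass, so re-reading the first page each time is enough and no `OFFSET` bookkeeping is needed:

```php
<?php
// Sketch: drop the COUNT query and loop until a page comes back empty.
// ($DB / myclassi and select_assoc() are taken from the question's code.)
$DB = new myclassi();
$page_size = 10000;

do {
    $sqlr = "SELECT `gi`.`igems_item_id`, `gi`.`old_sku`, `gi`.`diamond_video_url`
             FROM `gems_item` AS `gi`
             WHERE NOT EXISTS (SELECT 1 FROM `gems_item_cron_data` AS `gicd`
                               WHERE `gi`.`old_sku` = `gicd`.`gems_old_sku`
                                 AND `gi`.`igems_item_id` = `gicd`.`gems_itemid`)
               AND `gi`.`diamond_video_url` LIKE 'http%'
             LIMIT $page_size";
    $results = $DB->select_assoc($sqlr);
    foreach ($results as $row) {
        // ... build and run the INSERT for $row, as in the question ...
    }
} while (!empty($results));
$DB->close();
```

If a row can ever fail to insert, add a safety counter so the loop cannot spin forever on the same page.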

**Batch INSERT**

If practical, replace the loop and single-row inserts with

INSERT INTO ... (...)
    SELECT ...

That might give another factor of 10 in speedup.

If the SELECT returns no rows, there will be nothing to INSERT. Real cheap.
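Here is a sketch of that single statement using the column names from the question. It assumes the fixed values for `diamond_final_video_url` and `is_processed` can be produced in SQL, as the original loop's constants suggest; if the URL transformation really has to happen in PHP, this shortcut does not apply:

```php
<?php
// Sketch: one INSERT ... SELECT replaces the paging loop and the per-row inserts.
// ($DB / myclassi is taken from the question's code.)
$DB = new myclassi();
$sql = "INSERT INTO `gems_item_cron_data`
            (`gems_itemid`, `gems_old_sku`, `diamond_final_video_url`, `is_processed`, `created_at`)
        SELECT `gi`.`igems_item_id`, `gi`.`old_sku`, `gi`.`diamond_video_url`, 0, NOW()
        FROM `gems_item` AS `gi`
        WHERE NOT EXISTS (SELECT 1 FROM `gems_item_cron_data` AS `gicd`
                          WHERE `gi`.`old_sku` = `gicd`.`gems_old_sku`
                            AND `gi`.`igems_item_id` = `gicd`.`gems_itemid`)
          AND `gi`.`diamond_video_url` LIKE 'http%'";
$DB->insert($sql);
$DB->close();
```

As a bonus, this avoids interpolating row values into SQL strings, which in the original loop was also an injection risk.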

After you have done those, there may be further optimizations, but then I would need to see the new query, plus SHOW CREATE TABLE.

Rick James