
I have an array of objects passed from JavaScript named `results`, where every object has three properties: `country_id`, `national_id`, and `blood_group`. The running code below updates the `blood_group` column of each individual row in `blood_group_table` by matching on `country_id` and `national_id`. But this is not efficient for updating, say, 2 million rows, because it executes an individual UPDATE for each row. I tried concatenating all the UPDATE statements inside the loop and calling the server only once; that improved performance a lot, but I need something more efficient.

<?php
ini_set('max_execution_time', 30000000); 
set_time_limit(0);

$results = $_POST['results'];

foreach ($results as $item) {
    $obj = new stdClass();
    $obj->blood_group = $item['blood_group'];
    $obj->country_id = $item['country_id'];
    $obj->national_id = $item['national_id'];

    $pdo = connect();

    $sql_update = "UPDATE blood_group_table
                   SET blood_group = :blood_group
                   WHERE country_id = :country_id AND national_id = :national_id;";

    try {
        $update_query = $pdo->prepare($sql_update);

        $update_query->execute(
            array(
                'blood_group' => $obj->blood_group,
                'country_id' => $obj->country_id,
                'national_id' => $obj->national_id
            )
        );

        $update_query = null;

        $response = "Successfully updated.";
    } catch (PDOException $e) {
        $response = "Error occurred: " . $e->getMessage();
    }
}
?>
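For reference, the concatenation approach mentioned in the question can be sketched roughly like this (a minimal sketch; the `$results` shape is taken from the question, and the in-memory SQLite handle is only there to make the example self-contained — against MySQL you would reuse the existing connection and call `$pdo->exec($sql)` once per batch). Note that named placeholders cannot span a concatenated batch, so values are escaped with `PDO::quote()`:

```php
<?php
// Sketch: build one long string of UPDATE statements and send it to the
// server in a single round trip instead of one query per row.
function buildBatchSql(PDO $pdo, array $results): string
{
    $sql = '';
    foreach ($results as $item) {
        $sql .= sprintf(
            "UPDATE blood_group_table SET blood_group = %s " .
            "WHERE country_id = %s AND national_id = %s;\n",
            $pdo->quote($item['blood_group']),
            $pdo->quote($item['country_id']),
            $pdo->quote($item['national_id'])
        );
    }
    return $sql;
}

// In-memory SQLite stands in for the real connection here.
$pdo = new PDO('sqlite::memory:');
$sql = buildBatchSql($pdo, [
    ['blood_group' => 'A+', 'country_id' => '1', 'national_id' => '42'],
    ['blood_group' => 'O-', 'country_id' => '1', 'national_id' => '43'],
]);
echo substr_count($sql, 'UPDATE'), "\n"; // 2
```

With MySQL, sending several statements in one `exec()` call may require enabling multi-statement support on the connection, so chunking into moderate batch sizes and testing against your own setup is advisable.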
mnur
  • "Best" is always subjective. But if I needed to perform a bulk task like this, I would either use a [transaction](https://stackoverflow.com/a/10642752/231316) and batch 500 or 10,000 updates together (you need to experiment to see what works for your code), or I would concatenate 500 or 10,000 SQL statements together and send them all at once. I would also add logic to make sure 2,000,000 updates are actually required, maybe by pre-fetching all of them and comparing locally, only sending changes if needed. – Chris Haas May 13 '20 at 13:17
  • Move the DB connection and prepare outside of the `foreach` loop to start with. You only need to connect and prepare one time. If your DB engine is InnoDB start a transaction before your `foreach`, do all of the updates then close the transaction. You'll be amazed at how quickly the job goes. If your DB engine is NOT InnoDB then you will need to do things in batches but the connection and prepare still only need to be done once. – Dave May 13 '20 at 13:47
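Putting the two comments together, the connect-once, prepare-once, transaction-per-batch pattern can be sketched as follows (a hedged sketch, not a drop-in: an in-memory SQLite database is used so the example is self-contained, and the batch size of 10,000 is an assumption to tune experimentally, as the first comment suggests):

```php
<?php
// Sketch: connect and prepare ONCE, then run the per-row updates inside
// transactions of $batchSize rows each, so the engine commits in bulk.
$pdo = new PDO('sqlite::memory:', null, null, [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);
$pdo->exec('CREATE TABLE blood_group_table (
    country_id INTEGER, national_id INTEGER, blood_group TEXT)');
$pdo->exec("INSERT INTO blood_group_table VALUES (1, 42, 'O+'), (1, 43, 'O+')");

// Stand-in for the decoded $_POST['results'] payload from the question.
$results = [
    ['blood_group' => 'A+', 'country_id' => 1, 'national_id' => 42],
    ['blood_group' => 'B-', 'country_id' => 1, 'national_id' => 43],
];

$stmt = $pdo->prepare(
    'UPDATE blood_group_table SET blood_group = :blood_group
     WHERE country_id = :country_id AND national_id = :national_id'
);

$batchSize = 10000; // assumed value -- tune against your own workload
foreach (array_chunk($results, $batchSize) as $batch) {
    $pdo->beginTransaction();
    foreach ($batch as $row) {
        $stmt->execute($row); // keys map onto the named placeholders
    }
    $pdo->commit();
}

echo $pdo->query(
    'SELECT blood_group FROM blood_group_table WHERE national_id = 42'
)->fetchColumn(), "\n"; // A+
```

Batching the commits is what removes most of the cost: with autocommit on, every single-row UPDATE forces its own durable write, whereas a transaction amortizes that over the whole batch.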

0 Answers