I would like to get ideas on how to process a while loop that iterates over more than 3000 rows:
while ($row = mysql_fetch_array($result)) {
    if ($row["Status"] == 1) {
        // Curl 3PP and update DB
    } else {
        // Curl 3PP and update DB
    }
}
Since the result set is huge, the while loop takes around an hour to finish. Are there any suggestions to speed this up?
I was thinking of having another PHP script spawn workers to handle the cURL and DB-update part, something like this:
while ($row = mysql_fetch_array($result)) {
    if ($row["Status"] == 1) {
        // Pass data as a CLI argument (a query string only works over HTTP,
        // not on the command line) and background the process so exec()
        // doesn't block on each worker.
        exec("php curl_update.php success > /dev/null 2>&1 &");
    } else {
        exec("php curl_update.php fail > /dev/null 2>&1 &");
    }
}
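The worker side might look something like the sketch below. This is only an illustration under my assumptions: the script name `curl_update.php`, the `resolve_mode` helper, and the example endpoint are all made up; CLI scripts read their input from `$argv`, not `$_GET`.

```php
<?php
// curl_update.php -- minimal worker sketch (names are assumptions).
// Invoked by the spawner as: php curl_update.php success

function resolve_mode(array $argv): string
{
    // First CLI argument selects the mode; default to 'fail' if absent.
    return isset($argv[1]) ? $argv[1] : 'fail';
}

$mode = resolve_mode($argv);

// Placeholder for the real work: curl the 3rd-party API with the
// transactionID, then update the DB according to $mode, e.g.:
// $ch = curl_init('https://api.example.com/status?txn=' . urlencode($txnId));
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// $response = curl_exec($ch);
// curl_close($ch);
```

Note the transactionID itself would also have to be passed as an argument for the worker to do anything useful.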
Is this a good idea, or is there another way to fulfill the requirement using PHP or some other library?
In the cURL call I'll be sending a transactionID to a 3rd-party API to get the status, whether it's fail or success. Latency is around 1-5 seconds per request.
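Since the 1-5 s network latency dominates, one alternative I've also considered is batching the lookups with `curl_multi` so a chunk of requests runs concurrently inside a single PHP process, then updating the DB from the collected responses. A sketch, assuming a hypothetical status endpoint and a batch size of 50:

```php
<?php
// Sketch: run a batch of 3PP status checks concurrently with curl_multi.
// The endpoint URL is an assumption for illustration.

function check_batch(array $transactionIds): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($transactionIds as $id) {
        $ch = curl_init('https://api.example.com/status?txn=' . urlencode($id));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$id] = $ch;
    }

    // Drive all transfers at once; the batch finishes when the slowest
    // request does (~5 s) instead of the sum of all latencies.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    $responses = [];
    foreach ($handles as $id => $ch) {
        $responses[$id] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $responses;
}

// Usage: walk the 3000 IDs in chunks, update the DB per batch:
// foreach (array_chunk($allIds, 50) as $batch) {
//     foreach (check_batch($batch) as $id => $response) {
//         // ...parse $response, update DB for $id...
//     }
// }
```

With 50 concurrent requests per batch, 3000 rows would take roughly 60 batches of ~5 s each, instead of 3000 sequential round trips.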