
I have 100+ PHP scripts that need to be executed in parallel; each script inserts new data into the database and updates some previous records.

For example, each script URL would look like this:

example.com/update.php?data-from-project-1.com
example.com/update.php?data-from-project-2.com
example.com/update.php?data-from-project-3.com
And so on.

And update.php works something like this:

<?php
//insert some new records from project data
//update some records
?>

Here is what I am trying in the cron job page:

$urls = array("data-from-project-1.com", "data-from-project-2.com", ....);
for ($index = 0; $index < count($urls); $index++) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, "update.php?" . $urls[$index]);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_exec($ch);
    curl_close($ch);
}

But I guess this will execute the pages one after another and could hit timeout issues as well. Is there any way to execute them all in parallel from the cron job page? I don't need any output; they just need to update the database. Any help would be appreciated. Thanks.

3 Answers


I'd love to pretend this was my reply, but this answer from SO should cover it:

Executing multiple PHP scripts in parallel, and being notified when finished

— atlaz

Use PHP threads (the pthreads extension). Simple example:

<?php

class workerThread extends Thread {
    private $i;

    public function __construct($i) {
        $this->i = $i;
    }

    public function run() {
        while (true) {
            echo $this->i;
            sleep(1);
        }
    }
}

for ($i = 0; $i < 50; $i++) {
    $workers[$i] = new workerThread($i);
    $workers[$i]->start();
}

?>

You can perform each update in its own thread, but I suggest you create batches so each thread handles several updates, because every thread also takes a little time to start. Alternatively, each thread can work on a slice of the global "urls" array (start_index, end_index).
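That batching idea can be sketched as follows. This is only a sketch: it assumes the pthreads extension is installed (a ZTS CLI build of PHP), and `processUrl()` is a hypothetical stand-in for the actual insert/update work. The chunking itself is plain `array_chunk`:

```php
<?php
// Split the URL list into batches so each thread is started once and
// then processes several URLs, amortizing the thread start-up cost.
function makeBatches(array $urls, int $batchSize): array {
    return array_chunk($urls, $batchSize);
}

// Guarded so the file still parses and runs where pthreads is unavailable.
if (class_exists('Thread')) {
    class BatchWorker extends Thread {
        private $batch;

        public function __construct(array $batch) {
            $this->batch = $batch;
        }

        public function run() {
            foreach ($this->batch as $url) {
                // processUrl($url); // hypothetical: do the insert/update here
            }
        }
    }

    $urls = ["data-from-project-1.com", "data-from-project-2.com"]; // example input
    $workers = [];
    foreach (makeBatches($urls, 10) as $batch) {
        $workers[] = $worker = new BatchWorker($batch);
        $worker->start();
    }
    foreach ($workers as $worker) {
        $worker->join(); // wait until every batch has been processed
    }
}
```

With 100 URLs and a batch size of 10 this starts 10 threads instead of 100.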

— Ramin Darvishov

You can use PHP's curl_multi_init() to handle your tasks asynchronously.

I highly recommend this tutorial
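For reference, a minimal curl_multi sketch along the lines of the question's loop. This is a sketch, not a drop-in solution: it assumes the example.com base URL from the question, and per-request error checking is omitted:

```php
<?php
// Run all update.php requests concurrently via the curl_multi API.
function runInParallel(array $queries): void {
    $mh = curl_multi_init();
    $handles = [];

    foreach ($queries as $query) {
        $ch = curl_init("https://example.com/update.php?" . $query);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture (and discard) output
        curl_setopt($ch, CURLOPT_TIMEOUT, 300);         // per-request timeout, seconds
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Drive all transfers until every one has finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // block until there is activity, no busy loop
        }
    } while ($active && $status === CURLM_OK);

    foreach ($handles as $ch) {
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}

// Usage: runInParallel(["data-from-project-1.com", "data-from-project-2.com"]);
```

All requests are in flight at once, so the total time is roughly that of the slowest request rather than the sum of all of them.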

— MonkeyZeus