
I want to run more than 300 PHP scripts simultaneously in the background. I tried calling exec() one after another, but they run sequentially: the first exec() command finishes before the next one starts. How can I run all the PHP scripts in the background at the same time? Current code:

exec("wget -O - http://mywebsite.com/index.php >/dev/null 2>&1");
exec("wget -O - http://mywebsite.com/index3.php >/dev/null 2>&1");
exec("wget -O - http://mywebsite.com/index4.php >/dev/null 2>&1");
exec("wget -O - http://mywebsite.com/index5.php >/dev/null 2>&1"); 

This script executes index.php entirely before moving on to index3, index4 and index5. But I want to run them all at the same time. Any help would be appreciated.

Thank you!

Regards, John

John McLow
  • You are using wget, so it will execute one request after another. You could use the crontab command to create a scheduled job for all the PHP files at one time, with the schedule set to the current time – sandeep_kosta Oct 09 '15 at 07:10
  • Take a look at the `nohup` utility. – arkascha Oct 09 '15 at 07:11
  • @bornprogrammer Any example would be fine. – John McLow Oct 09 '15 at 07:25
  • Possible duplicate of [php background process using exec function](http://stackoverflow.com/questions/12842767/php-background-process-using-exec-function) – Matt Gibson Oct 09 '15 at 07:34
  • See the question I've marked as a duplicate. Basically, you do it the same way you'd start off the process in the background directly from the command line -- add an extra ampersand to the end of the command (`wget -O - http://mywebsite.com/index.php >/dev/null 2>&1 &`); there is a loop sketch of this just after these comments. Though starting off 300 wgets at once may be a little heavy on the box; as Vivek suggests, you may want to look into task queueing systems that can give you control over the maximum number of simultaneous tasks, etc. Or maybe look into using something other than PHP -- plain shell might be better in this case... – Matt Gibson Oct 09 '15 at 07:35
  • 1
    @arkascha nohup is used to run in background not simultaneously... requirement is to run all php script simultaneously in background – sandeep_kosta Oct 09 '15 at 07:45
  • @JohnMcLow I don't have a code example, but the process is: 1) create a cron job using crontab -e in the terminal, 2) add entries for all the files using either the curl extension or wget, and 3) have them all execute at the same time (there is a crontab sketch after these comments) – sandeep_kosta Oct 09 '15 at 07:48
  • Put the list of urls you want in a file and run GNU Parallel like this `parallel -j 32 -k -a theFile 'wget -O - {} '` to do 32 in parallel at a time and keep outputs in order. Or echo the urls to GNU Parallel's stdin and omit the `-a theFile` part. – Mark Setchell Oct 09 '15 at 08:22
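To illustrate the ampersand approach from Matt Gibson's comment: a minimal sketch in PHP, where the URL list is illustrative and stands in for your 300+ scripts.

<?php
// Scripts to trigger; replace with your real list of URLs.
$urls = [
    'http://mywebsite.com/index.php',
    'http://mywebsite.com/index3.php',
    'http://mywebsite.com/index4.php',
    'http://mywebsite.com/index5.php',
];

foreach ($urls as $url) {
    // Redirecting output and appending & detaches wget,
    // so exec() returns immediately instead of waiting.
    exec('wget -O - ' . escapeshellarg($url) . ' >/dev/null 2>&1 &');
}

Note that this launches everything at once with no throttling; with 300+ requests you may prefer a queueing system or the GNU Parallel approach mentioned above.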

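For the crontab approach sandeep_kosta describes, the entries could look like the sketch below. The schedule fields are illustrative: 10 7 9 10 * means 07:10 on 9 October, so set them to a minute just ahead of the current time, and remove the entries afterwards if you only want a one-off run.

# Edited via crontab -e; all entries share the same schedule,
# so cron starts them within the same minute.
10 7 9 10 * wget -O - http://mywebsite.com/index.php >/dev/null 2>&1
10 7 9 10 * wget -O - http://mywebsite.com/index3.php >/dev/null 2>&1
10 7 9 10 * wget -O - http://mywebsite.com/index4.php >/dev/null 2>&1
10 7 9 10 * wget -O - http://mywebsite.com/index5.php >/dev/null 2>&1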
1 Answer


You can look at using php-resque for this purpose. celery-php can also work for you.
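A minimal sketch of the queue-based idea, assuming the chrisboulton/php-resque library and a local Redis instance; the queue name and FetchPageJob class are made up for illustration:

<?php
require 'vendor/autoload.php';

// Point php-resque at Redis and enqueue one job per URL.
Resque::setBackend('localhost:6379');
foreach (['index.php', 'index3.php', 'index4.php', 'index5.php'] as $script) {
    Resque::enqueue('fetch', 'FetchPageJob', ['url' => 'http://mywebsite.com/' . $script]);
}

// Job class; it must be loadable by the worker processes, which call perform().
class FetchPageJob
{
    public function perform()
    {
        file_get_contents($this->args['url']);
    }
}

You then start as many worker processes as you want running concurrently, which gives you control over the maximum number of simultaneous requests instead of firing off 300 wgets at once.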

Vivek Srivastava