2

How to execute a PHP script from another?

I want to execute 3 PHP scripts from my PHP file without waiting for them to finish. In other words, the 3 PHP files need to be executed all at once (in parallel) instead of one-by-one (sequentially).

The 3 scripts are in the same folder as my main PHP file (script).

Qantas 94 Heavy
faressoft
  • possible duplicate: http://stackoverflow.com/questions/8073109/request-a-php-script-from-another-script-and-move-on?rq=1 – Jocelyn Aug 24 '12 at 22:01

5 Answers

3

If you do not want to wait for them to finish, run them with any of:

exec('php script.php > /dev/null 2>&1 &');
shell_exec('php script.php > /dev/null 2>&1 &');
system('php script.php > /dev/null 2>&1 &');
`php script.php > /dev/null 2>&1 &`

Any of those should accomplish the job, depending on your PHP configuration. Although they are different functions, their behaviour should be similar, since all output is redirected to /dev/null and the process is immediately detached. Note that these functions run the command through /bin/sh, where the bash-only `&>` shorthand is not reliable; the portable form is `> /dev/null 2>&1`.

I use the first solution in a production environment where a client launches a Bash SMS-sending script that can take up to 10 minutes to finish; it has never failed.

More info in: http://php.net/exec · http://php.net/shell_exec · http://php.net/system

  • @faressoft Every line of my example works as expected, just tried them, your server setup has an issue, which is now unrelated to the questions and answers here. –  Aug 24 '12 at 22:16
0

How about using exec("php yourscript.php")? Note that without redirecting the output and backgrounding the command, exec() will wait for the script to finish.

Also consider using a queueing system such as beanstalkd to store the script names as jobs, with a worker that fetches jobs from the queue and executes them.

amitchhajer
0

You need to run them as detached jobs, and that is not really easy, or portable. The usual solution is to use nohup, or to exec the scripts with stdout and stderr redirected to /dev/null (or NUL on Windows), but this approach often has portability and reliability issues.

If possible, make the three scripts available on the web server, and request them through asynchronous cURL calls (the curl_multi family). This also has the advantage of letting you test the scripts through the browser, and of giving you the scripts' output.
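A minimal sketch of the curl_multi approach. The file:// URLs below are stand-ins so the example is self-contained; in your setup the URLs would point at the three scripts on your web server (any such URL here is a placeholder, not from the original answer):

```php
<?php
// Fire several requests at once with curl_multi and collect the results.
// Two temp files fetched via file:// stand in for the three PHP scripts.
$tmp1 = tempnam(sys_get_temp_dir(), 'job');
$tmp2 = tempnam(sys_get_temp_dir(), 'job');
file_put_contents($tmp1, "result-one");
file_put_contents($tmp2, "result-two");
$urls = ["file://$tmp1", "file://$tmp2"];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Drive all transfers in parallel until every one has finished.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running > 0 && $status === CURLM_OK);

foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch), "\n";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);

unlink($tmp1);
unlink($tmp2);
```

If you truly do not care about the scripts' output, you can instead set a very small CURLOPT_TIMEOUT_MS on each handle and ignore the resulting timeouts, so the parent script moves on almost immediately.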

Other ways include using popen(), or, under Linux, the at or batch utilities.
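A sketch of the popen() route; `sleep 2` is a stand-in for a long-running script such as `php script.php`:

```php
<?php
// Open a process, detach immediately by closing the handle, and continue
// without waiting. The inner shell backgrounds the command and exits at
// once, so pclose() does not block for the 2 seconds.
$start = microtime(true);
$h = popen('sleep 2 > /dev/null 2>&1 &', 'r');
pclose($h);
echo (microtime(true) - $start < 1) ? "detached\n" : "blocked\n";
```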

LSerni
0

Taken from http://board.phpbuilder.com/showthread.php?10351142-How-can-I-exec%28%29-in-a-non-blocking-fashion:

In order to execute a command and have it not hang your PHP script while it runs, the program you run must not write output back to PHP. To do this, redirect both stdout and stderr to /dev/null, then background it:

> /dev/null 2>&1 &
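A small sketch of the difference this makes; `sleep 2` is a stand-in for a long-running `php script.php`:

```php
<?php
// With the redirection and trailing '&', exec() returns immediately;
// without the '&', exec() waits for the command to finish.
$start = microtime(true);
exec('sleep 2 > /dev/null 2>&1 &');   // backgrounded: returns at once
$detached = microtime(true) - $start;

exec('sleep 2 > /dev/null 2>&1');     // foreground: exec() blocks
$blocking = (microtime(true) - $start) - $detached;

echo ($detached < 1)  ? "detached immediately\n" : "detached blocked\n";
echo ($blocking >= 2) ? "foreground waited\n"    : "foreground returned early\n";
```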

In order to execute a command and have it spawned off as another process that is not dependent on the Apache thread to keep running (it will not die if somebody cancels the page), run this:

exec('bash -c "exec nohup setsid your_command > /dev/null 2>&1 &"');

For Windows, see http://www.php.net/manual/en/function.exec.php:

function execInBackground($path, $exe, $args = "") {
    if (file_exists($path . $exe)) {
        chdir($path);
        if (substr(php_uname(), 0, 7) == "Windows") {
            pclose(popen("start \"bla\" \"" . $exe . "\" " . escapeshellarg($args), "r"));
        } else {
            exec("./" . $exe . " " . escapeshellarg($args) . " > /dev/null &");
        }
    }
}
TheUO
0

With shell_exec on Windows, add the /B parameter to start; this allows you to run several executables at once without waiting. See my answer at this question: PHP on a windows machine; Start process in background.

It's the same idea:

shell_exec('start /B "C:\Path\to\program.exe"');

The /B parameter is key here. I tried to find the topic for you again, but I can't seem to find it anymore. This works for me.

I hope this solves the problem for you.

Jelmer