
Suppose there are two scripts, Requester.php and Provider.php. Requester requires processing from Provider and makes an HTTP request to it (Provider.php?data="data"). In this situation, Provider quickly finds the answer, but to maintain the system it must perform various updates throughout the database. Is there a way to immediately return the value to Requester and then continue processing in Provider?

Pseudocode

Provider.php 
{
   $answer = getAnswer($_GET['data']);
   echo $answer;
   //SIGNAL TO REQUESTER THAT WE ARE FINISHED
   processDBUpdates();
   return;
}
jW.

7 Answers


You can flush the output buffer with the flush() function.
Read the comments in the PHP manual for more info.
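
A minimal sketch of that approach, assuming the question's getAnswer() and processDBUpdates(). Whether the client actually sees the response early also depends on the web server and any output compression in between:

// Provider.php
ignore_user_abort(true);            // keep running even if the client disconnects
ob_start();                         // buffer the answer so we can measure it

$answer = getAnswer($_GET['data']);
echo $answer;

header('Content-Length: ' . ob_get_length());
header('Connection: close');        // tell the client not to wait for more
ob_end_flush();                     // send the buffered body
flush();                            // push it through PHP to the client

processDBUpdates();                 // the requester already has its answer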

Peter Olsson
  • 1,302
  • 3
  • 13
  • 19

I use this code for running a process in the background (works on Linux).

The process runs with its output redirected to a file.

That way, if I need to display the status of the process, it's just a matter of writing a small amount of code to read and display the contents of the output file.

I like this approach because it means you can completely close the browser and easily come back later to check on the status.
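
A sketch of that pattern (the helper name, command, and log path here are hypothetical):

// Launch a long-running command in the background on Linux,
// with stdout and stderr redirected to a log file.
function runInBackground($cmd, $outputFile)
{
    exec(sprintf('%s > %s 2>&1 &', $cmd, escapeshellarg($outputFile)));
}

runInBackground('php worker.php', '/tmp/job.log');

// Later (even from a fresh browser session), check on progress:
echo nl2br(htmlspecialchars(file_get_contents('/tmp/job.log')));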

Mark Biek

I think you'll need the Provider to send the data (be sure to flush), and then on the Requester, use fopen/fread to read an expected amount of data, so you can drop the connection to the Provider and continue. If you don't specify an amount of data to expect, I would think the Requester would sit there waiting for the Provider to close the connection, which probably doesn't happen until the end of its run (i.e. until all the secondary work-intensive tasks are complete). You'll need to try out a few POCs.

Good luck.
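
A sketch of the requester side of that idea. The URL and the expected byte count are hypothetical, and reading over HTTP with fopen() assumes allow_url_fopen is enabled:

// Requester.php
$expected = 32;   // agreed-upon size of the answer, in bytes (hypothetical)
$fp = fopen('http://example.com/Provider.php?data=data', 'r');

$answer = '';
while (strlen($answer) < $expected && !feof($fp)) {
    $answer .= fread($fp, $expected - strlen($answer));
}
fclose($fp);      // drop the connection; Provider can keep working

echo $answer;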

DreamWerx

You basically want to signal the end of one process (return to the original Requester.php) and spawn a new process (finish Provider.php). There is probably a more elegant way to pull this off, but I've managed this a couple of different ways. All of them basically boil down to exec()-ing a command in order to shell off the second process.

Adding > /dev/null 2>&1 & to the end of your command will allow it to run in the background without blocking execution of your current script.

Something like the following may work for you:

exec("wget -O - \"$url\" > /dev/null 2>&1 &"); 

-- though you could do it as a command-line PHP process as well.
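
For the command-line variant, something along these lines (the worker script is hypothetical):

// Hand the slow work to a detached CLI PHP process and return immediately.
exec('php worker.php ' . escapeshellarg($_GET['data']) . ' > /dev/null 2>&1 &');

// worker.php would then call processDBUpdates() with that argument.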

You could also save the information that needs to be processed and handle the remaining work in a cron job, which provides the same sort of functionality without the need for exec.

Michał Perłakowski

I'm going out on a limb here, but perhaps you should try cURL or use a socket to update the requester?
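
If the callback route sounds appealing, a sketch with cURL (the callback URL and payload are hypothetical):

// After finishing the slow work, Provider notifies the requester's side.
$ch = curl_init('http://example.com/requester_callback.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, ['status' => 'done']);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);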

Extrakun

Split the Provider in two: ProviderCore and ProviderInterface. In ProviderInterface, just do the "quick and easy" part and also save a flag in the database indicating that the most recent request hasn't been processed yet. Run ProviderCore as a cron job that searches for that flag and completes the processing. If there's nothing to do, ProviderCore will terminate and retry in (say) 2 minutes.
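
A sketch of the cron half, assuming a hypothetical pending_jobs table (id, data, processed) and made-up PDO connection details:

// ProviderCore.php -- run from cron, e.g.  */2 * * * * php ProviderCore.php
$db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Pick up everything flagged as unprocessed
foreach ($db->query('SELECT id, data FROM pending_jobs WHERE processed = 0') as $job) {
    processDBUpdates($job['data']);   // the slow follow-up work
    $db->prepare('UPDATE pending_jobs SET processed = 1 WHERE id = ?')
       ->execute([$job['id']]);
}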

Michał Niedźwiedzki

You could start another PHP process in Provider.php using pcntl_fork() (note that the pcntl extension is intended for CLI use and is typically not enabled in web-server environments).

Provider.php 
{
    // Fork process
    $pid = pcntl_fork();

    // You are now running both a daemon process and the parent process
    // through the rest of the code below

    if ($pid > 0) {
        // PARENT Process
        $answer = getAnswer($_GET['data']);
        echo $answer;    
        //SIGNAL TO REQUESTER THAT WE ARE FINISHED
        return;
    }

    if ($pid == 0) {
        // DAEMON Process
        processDBUpdates();
        return;
    }

    // If you get here ($pid == -1), the daemon process failed to start
    handleDaemonErrorCondition();
    return;

}
John Foley