The common approach is to render the output first, flush it to the client using flush(),
and then run the time-consuming task. You should also know about ignore_user_abort():
this function keeps PHP running even if the connection to the client has been closed (e.g. the user closes the browser).
I've prepared two scripts that illustrate this. The first is slow.php,
which flushes its output early and then starts a time-consuming task. The second is get.php,
which uses libcurl to fetch the page. If you test it, get.php returns almost immediately while slow.php is still running. I have also tested slow.php with a current Mozilla browser.
slow.php:
<?php
// The example will not work unless ob_end_clean() is called first:
// PHP may have opened an output buffer automatically (output_buffering
// in php.ini), and the early flush below never reaches the client
// until that buffer is closed.
ob_end_clean();
// Disable all content encoding, as we won't be able
// to calculate the Content-Length if it's enabled.
@apache_setenv('no-gzip', 1);
@ini_set('zlib.output_compression', 0);
@ini_set('implicit_flush', 1);
header("Content-Encoding: none");
// Tell the client to close the connection once the response is complete.
header("Connection: close");
// Keep the script running even if the client closes the connection.
ignore_user_abort(true);
// Buffer the output so we can compute the Content-Length afterwards.
ob_start();
// Do your output.
echo 'hello world', PHP_EOL;
// Get the content length and send it to the client.
$size = ob_get_length();
header("Content-Length: $size");
// Flush and close all ob* buffers. ob_end_flush() lowers the
// nesting level, so loop until no buffers are left (a for loop
// counting up against ob_get_level() would only flush half of them).
while (ob_get_level() > 0) {
    ob_end_flush();
}
flush(); // push PHP's internal output buffer to the client
// Start a time-consuming task.
sleep(3);
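
To convince yourself that slow.php really keeps running after the client has disconnected, you can log a line once the task finishes. This is a sketch of my own, not part of the original script, and 'slow.log' is just an example path:

// Append a timestamp after the time-consuming task so you can
// check later that the script completed despite the closed connection.
// 'slow.log' is a hypothetical file name.
file_put_contents('slow.log', date('c') . " task finished\n", FILE_APPEND);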
get.php:
<?php
// Simplest curl example: fetch slow.php and save the body to a file.
$url = 'http://localhost/slow.php';
$ch = curl_init($url);
$fp = fopen("example_homepage.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);   // write the response body to $fp
curl_setopt($ch, CURLOPT_HEADER, 0);   // don't include headers in the output
curl_exec($ch);
curl_close($ch);
fclose($fp);
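
If you want to see the numbers yourself, here is a minimal timing sketch (my addition, assuming allow_url_fopen is enabled). Because slow.php sends a Content-Length header and flushes early, the request should return in a fraction of a second instead of the ~3 seconds the script sleeps:

<?php
// Measure how long the request takes. With the early flush in
// slow.php, this should print well under 3 seconds.
$start = microtime(true);
$body = file_get_contents('http://localhost/slow.php');
printf("got %d bytes after %.3f seconds\n", strlen($body), microtime(true) - $start);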