
Possible Duplicate:
how can I achieve a task that should be done in thread in php

Normally when running a PDO query in PHP, it waits for the database result. Is there a way to avoid this?

I need my script to reply really fast, but the SQL that runs may take some time to run.

Example:

<?php
$pdo = new PDO('bla bla');

// This takes up to 1 second, but I need the script to reply within a few ms
$pdo->exec('UPDATE large_table SET foo = \'bar\' WHERE id = 42');
die('ok');

Is this doable?

Lilleman
  • Does your table have an index on the `id` column? If not, adding one will make the query run pretty much instantly. (Assuming that `id` is, in fact, a unique ID.) –  Jan 23 '13 at 21:30
  • @PeeHaa: I know how to fork other processes in PHP and execute external scripts, this question is supposed to be PDO and/or SQL specific. – Lilleman Jan 23 '13 at 21:52
  • So you cannot fork because? – PeeHaa Jan 23 '13 at 21:52
  • @duskwuff: The example SQL is just for show, the real one is much more advanced. I think I have optimized the real one quite much. However, that is a separate question. :) – Lilleman Jan 23 '13 at 21:52
  • @MikeB: It will stack them, but it does not matter for my scenario. (API calls, not browser loads) – Lilleman Jan 23 '13 at 21:53
  • @PeeHaa: I can, but I'm looking for a prettier solution. Like the delayed insert below, but for updates. So I'm mostly curious :) – Lilleman Jan 23 '13 at 21:54
  • An UPDATE should not take too much time. You can ANALYZE or OPTIMIZE the table. – kwelsan Jan 24 '13 at 08:36
  • kwelsan: did it occur to you that OPTIMIZE TABLE can run for days? Not the right solution. – John Jul 01 '18 at 22:35

3 Answers


For INSERT queries, you can use MySQL's INSERT DELAYED (see the manual). These queries are placed into a work queue and the statement returns instantly. The downside is that you don't get any feedback on whether the query was executed successfully.

For obscure reasons, however, there is no UPDATE DELAYED...
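A minimal sketch of the pattern described above. Note the caveats: INSERT DELAYED is MySQL-specific, only works with MyISAM/MEMORY-style storage engines, and was deprecated in MySQL 5.6 (removed in 5.7), so check your server version before relying on it. The DSN, credentials, and table/column names here are placeholders, not anything from the question.

```php
<?php
// Hypothetical example: DSN, credentials, and table/column names are
// placeholders. INSERT DELAYED queues the row server-side and returns
// before the write actually happens, so errors are silently lost.
$sql = "INSERT DELAYED INTO log_table (message) VALUES (:msg)";

// With a live MySQL connection you would run something like:
// $pdo  = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
// $stmt = $pdo->prepare($sql);
// $stmt->execute([':msg' => 'hello']); // returns without waiting for the write
echo $sql, PHP_EOL;
```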

helmbert

The common way would be to render the output first, then flush it to the client using flush(), and then do the time-consuming query. You should also know about ignore_user_abort(). This function keeps PHP running although the connection to the client may have ended (e.g. the user closes the browser).

I've prepared two scripts that illustrate this. The first is slow.php, which flushes output early and then starts a time-consuming task. The second is get.php, which uses libcurl to receive the page. If you test them, get.php will return almost immediately while slow.php is still running. I have also tested slow.php in a current Mozilla browser.

slow.php:

<?php
// The example will not work unless ob_end_clean() is called
// at the top. Strange behaviour! Would like to know the reason
ob_end_clean();

// disable all content encoding as we won't
// be able to calculate the content-length if its enabled
@apache_setenv('no-gzip', 1);
@ini_set('zlib.output_compression', 0);
@ini_set('implicit_flush', 1);
header("Content-Encoding: none");


// Tell client that he should close the connection
header("Connection: close");

// keep the script running even if the CLIENT closes the connection
ignore_user_abort(); 

// using ob* functions it's easy to compute the content-length later
ob_start();

// do your output
echo 'hello world', PHP_EOL;


// get the content length
$size = ob_get_length();
header("Content-Length: $size");

// clear ob* buffers
// flush all ob* buffers (ob_get_level() shrinks on each ob_end_flush(),
// so a counting for-loop would miss half of them)
while (ob_get_level() > 0) {
    ob_end_flush();
}
flush(); // flush PHP's internal output buffer to the client

// start a time consuming task
sleep(3);

get.php:

<?php
// simplest curl example
$url = 'http://localhost/slow.php';

$ch = curl_init($url);
$fp = fopen("example_homepage.txt", "w");

curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);

curl_exec($ch);
curl_close($ch);
fclose($fp);
hek2mgl
  • Thanks, but I need the client to get an HTTP 200 OK and a closed connection. This works well on browser-initiated jobs, however :) – Lilleman Jan 23 '13 at 21:59
  • Of course the client will get an HTTP 200 OK! Added the Connection: close header (though it is not really required in HTTP/1.1). Trust me, that's exactly what you need – hek2mgl Jan 23 '13 at 22:05
  • Say you're requesting this URL from another PHP script, using file_get_contents(). That will hang and wait for the closed connection. – Lilleman Jan 23 '13 at 22:12
  • In HTTP/1.1 usually the client closes the connection. Have you tested it? – hek2mgl Jan 23 '13 at 22:13
  • Yes, and since it will work in many situations, I will give you a +1 :) It will not solve my original question though. – Lilleman Jan 23 '13 at 22:15
  • Have tested it with file_get_contents() too. Yes, it doesn't work. I will find a solution (having the ctx param in mind). Await my update – hek2mgl Jan 23 '13 at 22:22
  • @Lilleman I failed to get it working using file_get_contents(). I think this is a bug. I finally used curl for it. Check my update – hek2mgl Jan 23 '13 at 23:41
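For completeness: when PHP runs under PHP-FPM, there is a built-in for exactly this early-flush pattern. fastcgi_finish_request() sends the complete response (including the HTTP 200) and closes the connection, while the script keeps executing. A minimal sketch, with the slow UPDATE stubbed out by a sleep():

```php
<?php
// Requires PHP-FPM; under CLI or mod_php the function does not exist,
// hence the guard. The client receives the response immediately.
ignore_user_abort(true);      // keep running if the client disconnects

echo 'ok';                    // the reply the client sees

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // flush response, close the connection
}

// Slow work continues after the client already has its answer,
// e.g. $pdo->exec("UPDATE large_table SET foo = 'bar' WHERE id = 42");
sleep(1);
$finished = true;             // marker: post-response work completed
```

Note this only backgrounds the work relative to the client; the PHP-FPM worker process stays occupied until the script finishes.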

Call your update script via AJAX showing a loader to your user while you do your update.

If you don't really need the results of the query in the output at page-load time, it's the only way.

napolux
  • 1
    Pragmatic solution. :) The question is more idealistic, however. I'm looking for a solution where PDO or SQL in itself voids the result and just puts the process to the background. – Lilleman Jan 23 '13 at 22:01