
My application is a digital signage player. We need to send remote proof of play data back to the cloud on a regular basis. Proof of play data is cached to help with network interruptions.

A player can easily create 6,000 POP records in a day. When the network comes back up, this can cause a pretty serious delay moving from one presentation to the other.

I was looking at using the async mode of mysqli to try and alleviate that delay... But I don't think it does what I think it does.

If I'm looping over a large result set, I'm still going to have to wait for that loop to complete. Do I bundle up 6,000 inserts and send them all at once? At what point does my AJAX call return a result if I'm still waiting on an async database call? Does it just plain not work this way? Google returns less than a page of results, and the docs don't really shed any light.

$sql = "SELECT MAX(time) from `pop` where `displayId` = ".$_GET['id'];
$res = $local->query($sql);
$row = $res->fetch_row();
//$log->lwrite("Last Updated is ".$row[0]);

$sql = "SELECT * FROM `pop` 
   WHERE `displayId` = ".$_GET['id']
   ." AND `time` > '".$row[0]."' LIMIT 1000";
$local->query($sql, MYSQLI_ASYNC);
if ($res = $local->reap_async_query()) {
    while ($row = $res->fetch_row()) {
        $log->lwrite(print_r($row, 1));
        $sql = "INSERT INTO `pop` 
          (`displayId`, `time`, `duration`, `presId`) 
          VALUES ('".$row[1]."', '".$row[2]."', '".$row[3]."', '".$row[4]."')";
        $remote->query($sql, MYSQLI_ASYNC);
    }
}

I've already resigned myself to the notion that I'm going to have to do this a different way. Now I'm just curious. :)

Am I missing something here?

Thanks for your time and help.

Jason Maggard
    there's no point in doing an async with your code as-is. you fire off the query, and then just start looping waiting for it to return. you might as well fire off a synchronous query and wait for that call to come back. – Marc B Dec 19 '14 at 21:44
  • Thanks Marc. I was hoping to keep the client end simple by limiting the number of interfaces, but it's easily solved by moving POP to its own API/timing loop. Unless I'm missing something, the API call wouldn't return until the async call was already reaped no matter the format. I'm just wondering how async really does anything useful for a server-side language. – Jason Maggard Dec 21 '14 at 02:48
  • A sad textbook example of XY problem – Your Common Sense Oct 30 '20 at 18:45
  • Voted down for sql injection vulnerabilities as pointed out by YCS – PHP Guru Oct 30 '20 at 18:55

2 Answers


Are you talking about the network between your php server and the DBMS? In that case you have really not understood async mysqli.

The point of async queries is to apply DML without blocking your PHP code: they give you a window in which to do other things (run other queries, send data to the client) while the server works. They are NOT a way to queue commands to a system which may be offline regularly, and you shouldn't be exposing a DBMS directly on the internet anyway. Nor are they a way to carry out operations on the data after the script has completed.
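To make that window concrete, here is a minimal sketch of the pattern async mysqli is actually designed for (the connection credentials are hypothetical, and this assumes a reachable server): fire the query, do other work, then poll and reap.

```php
<?php
// Sketch: fire an async query, keep working, then collect the result.
$link = new mysqli('localhost', 'user', 'pass', 'db'); // hypothetical credentials
$link->query('SELECT SLEEP(2), 42', MYSQLI_ASYNC);

// ... do other useful work here while the server runs the query ...

$links = $errors = $reject = [$link];
while (!mysqli_poll($links, $errors, $reject, 1)) { // wait up to 1s per pass
    $links = $errors = $reject = [$link];           // poll() modifies the arrays
    // still waiting: could service other connections here
}
if ($result = $link->reap_async_query()) {
    print_r($result->fetch_row());
    $result->free();
}
```

Note that the script still has to reap the result before the connection can be reused, which is exactly why async doesn't help with a flaky network between sites.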

There are lots of ways to solve the problem, most of which involve running daemons at both ends of the connection, which could be a major constraint for most people. You haven't said what control you have over the systems at each end. A compromise solution would be to use cron jobs on the satellite systems. Certainly you shouldn't be trying to replicate the data in the handler for a web request.

Are you trying to replicate the entire table locally? If so, why not use MySQL's built-in replication (over a VPN link).

symcbean
  • The database connection is one-to-many, so replication doesn't work. I am not trying to queue the command via async... When I query for schedule data, the request also sends back proof of play. After a long network outage we want to recover quickly, but the AJAX call is slowed down by doing numerous inserts, adding 5-10 seconds even uploading in relatively small batches. I want to keep it as responsive as possible without having to limit the batches too much. – Jason Maggard Dec 21 '14 at 02:41

The first thing wrong with your code is that you shouldn't execute your INSERT statements in a loop. The right way to do this is to insert multiple records into the database in a single INSERT statement. This is limited only by MySQL's max_allowed_packet setting, which caps the length of the query. Usually this is quite large and could easily accommodate 6000 records.

$sql = "SELECT * FROM `pop` 
   WHERE `displayId` = ".(int)$_GET['id']
   ." AND `time` > '".$local->real_escape_string($row[0])."' LIMIT 1000";
$res = $local->query($sql) or die($local->error);// no async
$sql = '';
while ($row = $res->fetch_row()) {
    $row = array_map(array($local, 'real_escape_string'), $row);// must have sanitization
    if ($sql) $sql .= ',';// separate each record with a comma
    $sql .= "('".$row[1]."', '".$row[2]."', '".$row[3]."', '".$row[4]."')";
    $log->lwrite(print_r($row, 1));
}

if ($sql) {// skip the INSERT entirely when there were no rows
    $sql = "INSERT INTO `pop` 
      (`displayId`, `time`, `duration`, `presId`) 
      VALUES $sql";
    $remote->query($sql, MYSQLI_ASYNC);// Do only one INSERT query instead of thousands
}

If you use MYSQLI_ASYNC here, PHP won't wait for the insert to complete and your AJAX call will finish without waiting for the remote database query to finish.
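Since the comments above flag SQL injection, here is a prepared-statement variant of the same batch insert as a sketch (it assumes the same `$res` result set and `$remote` mysqli connection from the code above, and the same column order). It avoids manual escaping, at the cost of losing MYSQLI_ASYNC, which mysqli prepared statements don't support:

```php
<?php
// Sketch only: assumes $res (local result set) and $remote (mysqli) from above.
$groups = [];
$params = [];
while ($row = $res->fetch_row()) {
    $groups[] = '(?, ?, ?, ?)';                    // one placeholder group per record
    array_push($params, $row[1], $row[2], $row[3], $row[4]);
}
if ($groups) {
    $sql  = 'INSERT INTO `pop` (`displayId`, `time`, `duration`, `presId`) VALUES '
          . implode(',', $groups);
    $stmt = $remote->prepare($sql);
    // bind_param needs a type string; treating every value as a string is safe here
    $stmt->bind_param(str_repeat('s', count($params)), ...$params);
    $stmt->execute();                              // synchronous: no MYSQLI_ASYNC for statements
    $stmt->close();
}
```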

PHP Guru