2

I have a form where a user can input data to find products in a database. The query has some unions and a lot of joins/subqueries to connect the data correctly, so retrieving products can take over a dozen seconds on the longest runs. If the user changes the search data and clicks the submit button a second time while the first query is still running, the second query waits for the first one to finish.

Is there a way to kill the first running query (instead of waiting for it to finish) to speed up searching when the user changes their mind and wants to find other products?

It's worth adding that I'm using the PDO interface to fetch data (sorry I did not mention it earlier).

  • 1
    Never have come across a solution myself (other than manually killing a process). I don't think `connection_aborted()` would work in this case... ? – IncredibleHat Jul 10 '18 at 18:14
  • For MySQL, take a look at this answer posted by @Petah: [Kill MySQL query on user abort](https://stackoverflow.com/questions/7582485/kill-mysql-query-on-user-abort) – acarlstein Jul 10 '18 at 18:59
  • @IncredibleHat - unfortunately `connection_aborted()` only seems to work once the script has finished. So I can detect that the user aborted (made a new) query, but only after the second query has finished, so it probably won't help me. – Kamil Czubak Jul 11 '18 at 17:12

2 Answers

0

For MySQL:

Take a look at @Petah's answer in [Kill MySQL query on user abort](https://stackoverflow.com/questions/7582485/kill-mysql-query-on-user-abort)
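In short, the technique in that answer is to remember the first connection's server-side ID and issue `KILL QUERY` from a second connection. A minimal PDO sketch, assuming placeholder DSN and credentials:

```php
<?php
// Sketch of the linked approach, adapted to PDO.
// The DSN, user, and password below are placeholders.
$dsn = "mysql:host=localhost;dbname=test";

// First connection runs the slow query; remember its server-side ID first.
$worker = new PDO($dsn, "user", "pass");
$connectionId = $worker->query("SELECT CONNECTION_ID()")->fetchColumn();

// ... later, e.g. when the user submits a new search, open a second
// connection and cancel only the running statement (KILL QUERY leaves
// the first connection itself alive, unlike plain KILL).
$killer = new PDO($dsn, "user", "pass");
$killer->exec("KILL QUERY " . (int) $connectionId);
```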

For PostgreSQL:

If you are running asynchronous queries against a PostgreSQL database, there is a method for that:

pg_cancel_query()

bool pg_cancel_query ( resource $connection )

pg_cancel_query() cancels an asynchronous query sent with pg_send_query(), pg_send_query_params() or pg_send_execute().

Note: You cannot cancel a query executed using pg_query().
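A minimal sketch of that usage, assuming the `pgsql` extension (not PDO) and a placeholder connection string:

```php
<?php
// Send the query asynchronously so the script is not blocked on it.
$conn = pg_connect("host=localhost dbname=test");
pg_send_query($conn, "SELECT pg_sleep(60)");

// ... the user changes their mind ...
if (pg_connection_busy($conn)) {
    pg_cancel_query($conn);  // cancel the in-flight asynchronous query
}
```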

acarlstein
  • OP is not using PostGres though, it's tagged as MySQL. If the database is complex enough to need a number of union and join statements as posted, it is unlikely that migration is viable even with good reason. – mopsyd Jul 10 '18 at 18:31
0

I found a solution. I don't know how good it is, but it works for now. I read (PHP Session Locking: How To Prevent Sessions Blocking in PHP Requests) that I could not make another request to MySQL because of session locking. Calling `session_write_close();` releases the session lock.
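For reference, releasing the lock looks like this (a sketch; it assumes `session_start()` happens here or earlier, e.g. in an include):

```php
<?php
session_start();
// ... read or write whatever you need in $_SESSION ...
session_write_close();  // release the session lock so a second request can proceed
// run the long query here; $_SESSION writes after this point are not persisted
```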

After this change I can make a second (and further) request, but with `SHOW FULL PROCESSLIST` executed in phpMyAdmin I could still see my long-running process, which of course was still running. The solution was to kill the process whose ID is stored in the session, and then overwrite the session with the current process ID.

// make a connection (assumes connection.php creates $conn as a PDO instance)
include("connection.php");

// if a process ID exists in the session and in the process list, kill it
if (!empty($_SESSION['connectionID']))
{
    $sql = $conn->prepare("SHOW FULL PROCESSLIST");
    $sql->execute();
    $resultArray = $sql->fetchAll(PDO::FETCH_ASSOC);

    foreach ($resultArray as $row)
    {
        if ($row['Id'] == $_SESSION['connectionID'])
        {
            // KILL cannot take a bound parameter, so cast to int to be safe
            $sql = $conn->prepare("KILL " . (int) $_SESSION['connectionID']);
            $sql->execute();
        }
    }
}

// store the current connection's process ID in the session
$sql = $conn->prepare("SELECT CONNECTION_ID() AS id");
$sql->execute();
$resultArray = $sql->fetchAll(PDO::FETCH_ASSOC);
$_SESSION['connectionID'] = $resultArray[0]['id'];

After these changes, if a previous process is still running but no longer needed, it will be killed.