Try to abort your Ajax request before navigating to another URL. Example with jQuery:
var request = $.ajax({
    url: "test.php"
}).done(function() {
    /* do something */
});

// Kill the request
request.abort();
You could kill the request every time a link is clicked, e.g.:
$('a').on('click', function() {
    request.abort();
});
On the server side, concurrent requests are not queued; they run in parallel. However, if there is a file lock, e.g. when a PHP script opens a file for writing, concurrent requests that need that file would be queued, and the execution of those PHP scripts would stall until the lock is released.
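As a minimal sketch of that situation (counter.php and counter.txt are hypothetical names, and sleep() only simulates slow work), two requests hitting this script at the same time would serialize on the flock() call:

<?php
// counter.php: concurrent requests queue up on the exclusive lock
$fp = fopen('counter.txt', 'c+');          // open (or create) the shared file
if (flock($fp, LOCK_EX)) {                 // other requests wait here until the lock is free
    $count = (int) stream_get_contents($fp);
    sleep(2);                              // simulate slow work while holding the lock
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) ($count + 1));
    fflush($fp);
    flock($fp, LOCK_UN);                   // release the lock; the next queued request proceeds
}
fclose($fp);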
This has been discussed before:
Simultaneous Requests to PHP Script
It seems that if you use file-based sessions in PHP, each session_start() acquires a lock on the session file. Calling session_write_close() as soon as your work with the session is done releases that lock.
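A minimal sketch of that pattern (sleep() stands in for your own long-running work, and 'user_id' is just an example session key):

<?php
session_start();                       // locks this session's file
$userId = $_SESSION['user_id'] ?? 0;   // read what you need from the session
session_write_close();                 // release the lock so parallel requests from the same session aren't blocked

sleep(5);                              // long-running work that no longer holds the session lock
echo json_encode(['user' => $userId]);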