I recently searched for a way to implement non-blocking Ajax requests. I found many answers, including this and this, that both satisfied my needs. After implementing the latter solution, I tried sending an Ajax request and then issuing another one, aborting the first (so the user doesn't have to wait) and letting the second one run.
However, I ran into a situation where, although tools like Firebug show that the first request is aborted, the second request takes as long as the time needed to execute both requests.
To make it more concrete, assume that "request1" takes 10 seconds to execute and "request2" takes 5 seconds. Now if I abort "request1" at second 5, then "request2" will take approximately 10 seconds (the remaining 5 seconds from "request1" plus 5 seconds of its own) to complete.
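The timing I'm seeing fits a simple model, assuming the server handles these requests one at a time and keeps processing an aborted request to completion (`waitTime` is just a hypothetical name for the sketch, not anything from jQuery):

```javascript
// Model: the server handles one request at a time, and a client-side
// abort does not stop server-side processing.
// waitTime: how long the second request appears to take from the
// client's point of view, when it is sent at the moment of the abort.
function waitTime(firstDuration, abortAt, secondDuration) {
    // The server is still busy with the remainder of the first request.
    var remaining = Math.max(0, firstDuration - abortAt);
    return remaining + secondDuration;
}

console.log(waitTime(10, 5, 5)); // request2 appears to take 10 seconds
```

This reproduces exactly the numbers above: abort request1 (10s) at second 5, send request2 (5s), and request2 seems to take 10 seconds.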
I monitored the Apache server, and my first guess is that once a request reaches the server, it consumes server resources whether or not we abort it on the client side; the only effect of aborting is that the client no longer waits for the result, because the requester (request1 in my example) is aborted.
Now my question: if my guess is right, is there any way to let the server know it should stop the current process and start the next one? (This question has no answer.)
Or, if our server has more than one processor, how can we have the new request executed by another one?
Here is my code for further investigation:
// Ajax request queue
var ajaxQueue = [];

// Push the new Ajax request into the queue
ajaxQueue.push(
    $.ajax({
        type: 'POST',
        cache: false,
        url: url,
        data: { 'token': $.cookie('s3_ctoken') },
        dataType: 'html',
        beforeSend: function (xhr, settings) {
        },
        success: function (res, textStatus, xhr) {
        },
        error: function (xhr, textStatus, thrownError) {
        },
        complete: function () {
        },
        async: true
    })
);

// On the next Ajax request: some requests may still be running, so abort them
if (ajaxQueue.length > 0) {
    for (var i = 0; i < ajaxQueue.length; i++) {
        ajaxQueue[i].abort();
    }
    ajaxQueue.length = 0; // clear the queue so aborted requests aren't re-aborted
}
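For completeness, the client-side bookkeeping above can be wrapped so that each new request automatically aborts the previous one. This is only a sketch of the pattern; `makeLatestOnly` is a hypothetical helper name, and `send` would be any function that starts a request and returns something with an `abort()` method (such as a function that calls `$.ajax` and returns the jqXHR):

```javascript
// Sketch: "latest request wins". Each call aborts the previous
// in-flight request before starting a new one.
function makeLatestOnly(send) {
    var current = null;
    return function () {
        if (current) {
            current.abort(); // cancel the previous in-flight request
        }
        current = send.apply(null, arguments);
        return current;
    };
}
```

This keeps the abort logic in one place, but it only cancels the request on the client; whether the server also stops working is exactly what my question is about.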