3

I've built a JavaScript application which has a graph with some dropdown filters that users are able to change. The dropdowns all have event listeners which submit a server request to get the data (via a jQuery Ajax call) and then graph the data. The issue arises if the user uses the keyboard to quickly cycle through many different options in a dropdown.

The server call takes roughly a second, so if they quickly scroll through, say, 20 options, this can lead to a buildup: 20 different requests to the server are created, and there's not even a guarantee that the last success callback to run corresponds to the most current filter.

So my question is what is the best way when a filter is changed to kill all other asynchronous processes? Here's the relevant code:

$("#filter").change(function(d) {
  getData();
} //end of if
});

function getData() {
  ...
  $.ajax({
    url: myUrl,
    type: "GET",
    dataType: "json",
    success: function(d) {
        [do some stuff]
      } //end of success function
  }); //end of ajax
} //end of getData
vsync
zachvac
  • You have an extra curly bracket in that first function saying it closes an `if`. – AtheistP3ace Jun 15 '17 at 19:13
  • You should wait until the currently running request is finish before sending a new one (and leave out some request that might occur while the current one is running). Even if you would use the `abort` functionality of `XMLHttpRequest2` this could still lead to a unnecessary high server load. – t.niese Jun 15 '17 at 19:13
  • Possible duplicate of [Abort previous AJAX call when a new one made?](https://stackoverflow.com/questions/9285271/abort-previous-ajax-call-when-a-new-one-made) – Heretic Monkey Jun 15 '17 at 20:44

4 Answers

2

Save the Ajax call you may want to abort into a variable; then you can abort the previous call before starting a new one.

This is a very common practice when an Ajax call might fire many times before the calls before it have had a chance to finish.

function getData(){
    $filter.data('REQ') && $filter.data('REQ').abort(); // abort the last request before creating a new one

    return $.ajax({
        url: myUrl,
        type: "GET",
        dataType:"json",
        success:function(d){
            [do some stuff]
        }
    })
}

var $filter = $("#filter");

$filter.on('change', function(d) {
    $filter.data('REQ', getData())
});

Of course, this is very simplified code and you should rewrite it in a more structured way, but it gives you an idea of how to cache the last Ajax request, which gives you the power to abort it before sending a new call.
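A related, framework-free way to handle the asker's "last success callback may not match the current filter" problem is to tag each request and ignore stale responses. This is a hedged sketch; the helper names (`getDataSafe`, `fetchFn`) are hypothetical and not part of the answer's code:

```javascript
// Track the most recent request; responses from superseded requests are ignored.
let latestRequestId = 0;

function getDataSafe(fetchFn) {
  const id = ++latestRequestId; // tag this call as the newest
  return fetchFn().then((data) => {
    // A newer call started while this one was in flight; drop the stale result.
    if (id !== latestRequestId) return null;
    return data; // only the latest request's data reaches the graph
  });
}
```

Unlike `abort`, this does not save any server work, but it does guarantee that only the most recent filter's data is ever graphed.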


By the way, your title has nothing to do with the question. You are asking how to handle a sequence of Ajax calls, not about events, so you should change the title to fit the problem.


Update regarding what @t.niese had said in the comments:

Throttling the requests on the client side is also a good idea, since the server cannot always tell whether the client has aborted the request, and a resource-demanding request should be throttled regardless. BUT, I would suggest throttling the requests on the server side and not (only) on the client side, if possible, because client-side throttling can be bypassed and is not 100% reliable, and it "costs" about the same amount of time to do it on the server side.

Can a http server detect that a client has cancelled their request?
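For completeness, the client-side throttling mentioned above can be sketched like this (a hypothetical `throttle` helper, not part of the answer's code):

```javascript
// Invoke fn at most once per intervalMs; extra calls in between are dropped.
function throttle(fn, intervalMs) {
  let last = 0;
  return function (...args) {
    const now = Date.now();
    if (now - last >= intervalMs) {
      last = now;
      fn.apply(this, args);
    }
  };
}

// e.g. $("#filter").change(throttle(getData, 1000));
```

Note that throttling drops intermediate calls entirely, so it is usually combined with one of the "run the last request" approaches below to make sure the final filter value still gets fetched.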

vsync
    I modified the code a bit, just stored the ajax request to a variable with global scope and called abort on that variable at the beginning of getData, but the approach worked perfectly thanks. – zachvac Jun 15 '17 at 20:57
  • It should be mentioned that `abort` will only stop the request on the client side, but once it hit the server the server might still need to do the whole processing for the request, even if it was aborted. So it might not have an effect on the server load, and if this results in a bottle neck, then `abort` wont help, and the number of emitted requests should be throttled. @zachvac – t.niese Jun 16 '17 at 03:58
  • @t.niese - it depends how quickly are you calling the `getData` method. if you rapidly calling it then the request wouldn't have enough time to get out to the server. the server can and should also detect by itself if there's a rapid flow of the same requests (per user ID) and should throttle it **on the server** simply because client-side throttling is worthless when one can simply bypass it easily or by man-made-bug. – vsync Jun 16 '17 at 11:30
-1

I would drop all attempts to initiate a new request via getData while a current request is running, and only send the last getData attempt once the current request has finished. This ensures that the server load won't become unnecessarily high, because at most one request runs at a time.

var currentRequest;
var resubmitRequest;

function getData() {

  // only start a new request if no current request is running
  if (!currentRequest) {
    resubmitRequest = false;
    currentRequest = Promise.resolve($.ajax({
      url: myUrl,
      type: "GET",
      dataType: "json"
    })); //end of ajax

    currentRequest
      .then((d) => {
        [do some stuff]
      })
      .finally(() => {
        // the current request has settled; allow new requests again
        currentRequest = null;
        // if another attempt to request data happened while this
        // request was running then call getData again
        if (resubmitRequest) {
          getData()
        }
      })

  } else {
    // store the information that the data has to be requested 
    // another time after the currently running request finished
    resubmitRequest = true;
  }

  return currentRequest;
} //end of getData
t.niese
-1

You can simply debounce it: wait, for example, 500 milliseconds, then run only the 'last' change and do that Ajax call.

$("#filter").change(function(d) {
  // do not run a new ajax call if the last one is not finished
  if ($.active > 0) return false;
  // if a new 'change' event is raised in less than 500ms,
  // clear the pending timer and run only the 'last' one
  clearTimeout(window.timer);
  window.timer = setTimeout(function() {
    getData();
  }, 500);
});

In case the user changes it while the Ajax call is running, return false :)

Kresimir Pendic
-1

With little effort you can code your own handler for these cases:

function handler(ms, fn) {
    var eventId;
    return function () {
        // if there is an event programmed, kill it
        clearTimeout(eventId);
        // and program the new one (replace the event)
        eventId = setTimeout(fn, ms);
        // if no new event arrives within ms, setTimeout executes the function
    }
}

// execute getData if no event is fired in a 1000ms interval (the user stopped typing)
// getData is the same that you have
var fn = handler(1000, getData);

$("#filter").change(fn);   
luisenrike