
I have something like this: loop through the ids and for each one, make an ajax request (async:true) to the server (same domain) and append received data to a DOM element. Not a hard task to accomplish, it works. Example code:

$.each(ids, function (index, id) {
    $.ajax({
        type: 'POST',
        url: "http://localhost/example/"+id,
        success: function (data) {
            $('#content').append(data);
        }
    });
});

My problem is: when I have too many IDs to loop through (1000, for example), it sends a lot of ajax requests and the browser "freezes": when I click on a link, it waits for all the ajax requests to finish before it opens the clicked link.

A timeout isn't an option, because I need to show as much of the received data as possible. Whether it takes 1 second or 100 seconds to finish is no problem, because the user will see the requests that have already finished.

So, what's the best way to handle this? Should I stop the ajax requests when a link is clicked? Or is there a better way to do this?
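If aborting on navigation turns out to be the chosen route, one common pattern is to keep a registry of in-flight requests and abort them all in a link's click handler. This is only a sketch; the names (`pendingRequests`, `trackRequest`, `abortAll`) are illustrative, not from the question:

```javascript
// Registry of in-flight jqXHR objects so they can be cancelled later.
var pendingRequests = [];

// Wrap each $.ajax() call with trackRequest() to remember its jqXHR.
function trackRequest(jqXHR) {
    pendingRequests.push(jqXHR);
    return jqXHR;
}

// Abort everything still pending, e.g. from a link click handler.
function abortAll() {
    pendingRequests.forEach(function (req) {
        req.abort(); // jqXHR.abort() cancels the underlying XMLHttpRequest
    });
    pendingRequests = [];
}

// Usage sketch:
// trackRequest($.ajax({ /* ... */ }));
// $(document).on('click', 'a', abortAll);
```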

Matheus Vellone
    If you're potentially sending anywhere remotely close to up to 1000 ajax requests, then the problem is not the browser freezing, it's that your application is horribly architected. Find a better way. – Adam Jenkins Mar 12 '15 at 16:19
  • 3
    I would say: build a list of IDs and send the entire list to the server all at once. – Martijn Arts Mar 12 '15 at 16:20
  • 1
  • A further FYI - the browser can only handle 6 simultaneous connections max. If it receives more, it queues them up and processes them in order (when one connection is finished, it will open a new one, to a maximum of 6 simultaneous connections). This is why you can't load the response from a link when you're running hundreds of connections - the browser has to wait until there are no more than 5 connections opened/queued so it can open a new one (the 6th one) to load the content for the link that you just clicked. – Adam Jenkins Mar 12 '15 at 16:22
  • The browser will limit the number of open requests. For instance firefox will only do 6 I believe. So, your requests are going to take a long time if there are a thousand of them. You should do your appends in batches. Send 20 or more ids at a time. Then it would still look dynamic, but not clog up the works. – dudeman Mar 12 '15 at 16:23
  • As the first two commenters have noted, you are approaching the problem from the wrong way. Instead of sending 1000 individual AJAX requests, you can submit your IDs as an array and allow the server to do the heavy lifting. Then again, it really depends *what* are you exactly doing with the 1000 requests. Are they simple, or complicated? Are you querying against a database? If so, is the database appropriately indexed? – Terry Mar 12 '15 at 16:26
  • I send AJAX requests individually because I need the content related to each ID as soon as the server finishes processing it. If I send all at once, I would have to wait for the server to process all the IDs. – Matheus Vellone Mar 12 '15 at 16:45
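The batching idea from the comments can reconcile both concerns: splitting the IDs into small batches (e.g. 20 per request, as dudeman suggests) cuts 1000 requests down to 50 while still rendering each batch as it arrives. A minimal sketch, assuming a hypothetical batch endpoint on the server:

```javascript
// Split a list into batches of at most `size` elements.
function chunk(list, size) {
    var batches = [];
    for (var i = 0; i < list.length; i += size) {
        batches.push(list.slice(i, i + size));
    }
    return batches;
}

// Usage sketch (the '/example/batch' endpoint is an assumption,
// not something the question's server necessarily provides):
// $.each(chunk(ids, 20), function (index, batch) {
//     $.ajax({
//         type: 'POST',
//         url: '/example/batch',
//         data: { ids: batch },
//         success: function (data) { $('#content').append(data); }
//     });
// });
```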

2 Answers


Instead of making 1000 ajax calls to the application, you can have the server return JSON in the format

[
  { id: 0, content: "A" },
  { id: 1, content: "B" },
  .
  .
  { id: n, content: "Nth content" }
]

With a request like

var idList = [];
$.each(ids, function (index, id) {
    idList.push(id);
});
$.ajax({
    type: "POST",
    url: "/example/batch", // hypothetical endpoint that accepts the whole list
    data: { id: idList },
    success: function (data) {
        $.each(data, function (index, v) {
            $("#content").append(v.content);
        });
    }
}).fail(function (response) {
    $("#content").append(response);
});

Chrome, as far as I know, can handle 8 concurrent ajax requests at a time; a number like 1000 would be far too much for it. The above code reduces the number of ajax calls made from 1000 to 1.

If, however, you do want to make 1000 calls, then you are better off chaining them, one request at a time:

var idList = [];
$.each(ids, function (index, id) {
    idList.push(id);
});
function makeAjax() {
    $.ajax({
        type: "POST",
        url: "/example", // hypothetical endpoint; adjust to your server
        data: { id: idList.pop() },
        success: function (data) {
            $("#content").append(data);
        }
    }).always(function () {
        // fire the next request whether this one succeeded or failed,
        // so one failure doesn't stall the whole chain
        if (idList.length > 0) {
            makeAjax();
        }
    });
}
makeAjax();

The code below keeps at most 8 ajax requests in flight at a time.

var idList = [];
$.each(ids, function (index, id) {
    idList.push(id);
});
var MAX_THREADS = 8;
var CURRENT_THREADS = 0;

function makeAjax() {
    CURRENT_THREADS++;
    $.ajax({
        type: "POST",
        url: "/example", // hypothetical endpoint; adjust to your server
        data: { id: idList.pop() },
        success: function (data) {
            $("#content").append(data);
        }
    }).always(function () {
        // free the slot whether the request succeeded or failed
        CURRENT_THREADS--;
    });
}

function callAjax() {
    // fill all free slots, not just one per tick
    while (idList.length > 0 && CURRENT_THREADS < MAX_THREADS) {
        makeAjax();
    }
    if (idList.length > 0) {
        setTimeout(callAjax, 100);
    }
}

callAjax();

However, I don't recommend doing this. You are better off with some other solution, like pagination, if fetching the content for all IDs at once takes a long time.

Prathik Rajendran M
  • That's the perfect solution! BUT if I send all IDs in one ajax request, the user will need to wait for the server to process all the IDs before any content shows. I need to show content as soon as the server finishes processing each ID. That's why I can't send all IDs together. – Matheus Vellone Mar 12 '15 at 16:42
  • Updated the answer, this sends one id at a time, will update with a way to send more concurrently. – Prathik Rajendran M Mar 12 '15 at 16:46
  • Man! That's what I was looking for! I guess this will work like a charm! Thanks!! :D – Matheus Vellone Mar 12 '15 at 16:52

A better approach would be to change the way you're requesting data.

If at all possible, try and change the server to accept more than one ID and make one request with all of the IDs at once. This way, you're looping through data instead of AJAX requests.

Think about it. For an array of 1000 IDs, you're making at least 1001 requests (1 for the page!) every time you hit that function. If each request is backed by a database, that's 1000 separate database hits.

A modern web browser will only handle so many simultaneous AJAX requests before queuing the rest up for processing later. There's probably no good way of increasing performance in the code you've provided.
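The queuing behaviour described above can be sketched in userland too: run at most `limit` tasks at once and start the next queued task as each one finishes. This is an illustrative sketch (the `TaskQueue` name and callback shape are assumptions, not part of any answer's code); each task is a function that receives a `done` callback:

```javascript
// A minimal concurrency-limited task queue.
function TaskQueue(limit) {
    this.limit = limit;   // max tasks running at once
    this.running = 0;     // tasks currently in flight
    this.queue = [];      // tasks waiting for a free slot
}

// Enqueue a task and try to start it immediately.
TaskQueue.prototype.push = function (task) {
    this.queue.push(task);
    this.next();
};

// Start queued tasks while free slots remain.
TaskQueue.prototype.next = function () {
    var self = this;
    while (self.running < self.limit && self.queue.length > 0) {
        var task = self.queue.shift();
        self.running++;
        task(function () {
            self.running--;
            self.next(); // a slot freed up; start the next task
        });
    }
};
```

A browser does something similar internally per host, which is why the 1001st request waits no matter how fast the script issues them.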

villecoder
  • But, if I send the entire ID array via ajax, I can't show the results one by one. The idea is also show some information associated with the ID asap, while processing the others. – Matheus Vellone Mar 12 '15 at 16:36
  • You need to iterate through the result set using $.each (like Prathik's example). Maybe you've got some more code that you didn't include? But for a simple $("#content").append(data), it shouldn't be that hard. – villecoder Mar 12 '15 at 16:41
  • Each of these ajax requests takes 1~2 seconds when the response is not cached. 1000 (maybe more) would take too long to wait for in a single response. – Matheus Vellone Mar 12 '15 at 16:47