
I have what I would have thought was a very basic issue whose solution is for some reason completely eluding me.

I have an HTML form #sendform with a lot of checkboxes each with the class .userCheckbox. The number of checkboxes is not predictable; it can be anywhere from just one or two to several thousand. Each box represents a user, and if it is checked, that user should receive an e-mail when the form is submitted.

The intended outcome is that submitting the form sends an AJAX request to the PHP backend, adding a class loading to the parent container (which shows a little spinning wheel) when the request is sent, and then replacing that with a success or error class, as relevant, when the request completes. I don’t want to send off the entire bulk of checkboxes at once, since (correct me if I’m wrong) that would entail having to wait for all rows to be processed before getting any visual feedback on the page.

So the very basic skeleton is something like this:

$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);

    $form.find('input.userCheckbox').each(function(i) {
        var $this = $(this);

        if (this.checked) {
            $.ajax({
                url: $form.prop('action'),
                type: $form.prop('method'),
                dataType: 'json',
                data: { /* ... stuff here ... */ },
                beforeSend: function() { /* Add loading class to parent */ },
                success: function(data, textStatus, jqXHR) { /* Add success/fail class to parent depending on data returned */ },
                error: function(jqXHR, textStatus, errorThrown) { /* Add fail class to parent */ }
            });
        }
    });
});

This of course works, in theory, but in practice, all the AJAX requests being sent off pretty much at the same time wreaks havoc. The server (a bog-standard, low-end shared server) seems to handle about twenty or so more-or-less-simultaneous AJAX requests, and then it gives up and the rest of the calls fail with the textStatus “Error” and the errorThrown “Too many requests”.

So what I need is to fire off the AJAX requests one at a time, each iteration of the loop waiting until the previous AJAX call has completed before sending off another one—but also, preferably, without locking up the entire browser, as async: false would do. Enter callback hell, or at least the (to me) very inelegant solution of a recursive function called in the complete callback of the AJAX request.
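For reference, that recursive-callback version might look something like the sketch below. It uses a generic `send(item, done)` callback standing in for the `$.ajax` call, with `done` playing the role of the AJAX `complete` callback; all names here are illustrative, not from my actual code:

```javascript
// Sequential processing via recursion in the completion callback.
// `send(item, done)` stands in for the $.ajax call; `done` plays the
// role of the AJAX `complete` callback.
function processSequentially(items, send, onAllDone) {
    var results = [];
    (function next(index) {
        if (index >= items.length) {
            onAllDone(results);
            return;
        }
        send(items[index], function (result) {
            results.push(result);
            next(index + 1); // recurse only once this item has completed
        });
    })(0);
}
```

With `$.ajax`, the `next(index + 1)` call would sit in the `complete` callback, which is exactly the flavour of callback nesting I was hoping to avoid.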

Surely this must be an exceedingly common question to have, I thought to myself, and went searching for better alternatives. Sure enough, there is a plethora of StackOverflow questions on variations of this topic, and several plethorata1 of random Internet people offering various solutions to it as well.

The common denominator among the vast majority of these solutions is that using deferred objects and promises is The Way To Go. Very well—I’ve never really grokked or made use of deferred objects, so I figure this is as good a time as any to figure out just what exactly they are and how they work. This turns out to be trickier than expected, and I’m still not sure I’ve quite got it, which may be why I’m not getting this.

I’ve tried implementing at least five different variations of promise-based solutions found here and elsewhere now, and my problem is that none of them works. Partly because the vast majority are six or seven years old, and deferreds and promises have changed a lot since then in ways that I can’t quite keep track of.

At the moment, I have implemented a variation on this solution, essentially thus:

$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);
    var promise = $.when({});

    $form.find('input.userCheckbox').each(function(i) {
        var $this = $(this);

        if (this.checked) {
            promise = promise.then(sendmail($this, $form));
        }
    });
});

function sendmail($this, $form) {
    var defer = $.Deferred();

    $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { /* Stuff here */ },
        beforeSend: function() {
            console.log($this.val() + " start: " + Date.now());
            /* + Add loading class to parent */
        },
        success: function(data, textStatus, jqXHR) { /* Add success/fail class to parent */ },
        error: function(jqXHR, textStatus, errorThrown) { /* Add fail class to parent */ },
        complete: function() {
            console.log($this.val() + " end: " + Date.now());
            defer.resolve();
        }
    });

    return defer.promise();
}

This works… exactly the same way that the code at the top works. In other words, it just shoots off all the AJAX requests basically at once. No waiting for the previous call in the promise queue to finish before sprinting ahead. I log the start and completion times for each AJAX request to the console just to check; and sure enough, this is what I get, trying with ten checkboxes:2

1007 start: 12:12:41.333
1008 start: 12:12:41.341
1009 start: 12:12:41.346
1010 start: 12:12:41.350
1011 start: 12:12:41.355
1012 start: 12:12:41.359
1013 start: 12:12:41.363
1014 start: 12:12:41.367
1015 start: 12:12:41.372
1016 start: 12:12:41.375
1007 end: 12:12:42.140
1008 end: 12:12:42.553
1010 end: 12:12:42.639
1009 end: 12:12:42.772
1011 end: 12:12:42.889
1013 end: 12:12:43.007
1015 end: 12:12:43.157
1016 end: 12:12:43.289
1012 end: 12:12:43.422
1014 end: 12:12:43.570

Each AJAX request takes about a second or so to complete, but the next one gets started immediately, within a few milliseconds.

I think I see why this is happening.

The sendmail function does return a promise object, but the expression sendmail($this, $form) is evaluated immediately, while the loop is still running: the AJAX request fires right away, and .then() is handed an already-started promise rather than a function to call later. So there is no instruction to actually wait until the previous deferred object is resolved before starting the next request.
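The difference can be shown with plain Promises, no jQuery or AJAX involved (`task` here is just an illustrative stand-in for sendmail):

```javascript
var log = [];
function task(name) {
    log.push(name + " start");               // the "request" fires here
    return Promise.resolve().then(function () {
        log.push(name + " end");             // the "request" completes here
    });
}

// Broken: task("a") and task("b") are *called* while the chain is being
// built, so both "requests" start immediately:
//   Promise.resolve().then(task("a")).then(task("b"));

// Working: .then() receives a function, which it only calls once the
// previous step has resolved:
Promise.resolve()
    .then(function () { return task("a"); })
    .then(function () { return task("b"); });
```

In the working version the log reads "a start", "a end", "b start", "b end"; in the broken one, both tasks log their start before either has ended.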

All the different approaches I’ve come across here on SO and elsewhere have had this in common—they essentially do nothing to solve the actual issue, as far as I can tell.

So the actual question:

Is there no way to make use of deferred/promise object queues to make sure that an AJAX request made in a loop iteration is not made until the request made in the previous iteration has completed?

Am I just completely missing something blindingly obvious, or misunderstanding how deferreds and promises work?

 


1 Yes, I know plethora is feminine, not neuter—indulge me.

2 Not actually using Date.now() since Unix timestamp milliseconds aren’t the most readable things in the world, but just a trivial wrapper function that I’ve left out here lest this question get any longer than it already is.

Janus Bahs Jacquet
  • *"but quickly generates a bunch of “Too many requests!” errors"* I'm curious, where are you getting those errors from? – T.J. Crowder May 12 '17 at 10:41
  • @T.J.Crowder Straight from the Apache server. They appear as the `textStatus` and `errorThrown` variables in the `error` callback when the AJAX requests fail. The first twenty or so calls go through, and then the rest of them fail. Edited for clarity. – Janus Bahs Jacquet May 12 '17 at 10:44
  • You are completely missing the fact that jQuery Ajax requests *already are promises*. You don't need to do any manual plumbing with `$.Defer()` at all. It's a mystery to me why you would want to send 100 Ajax requests, one for each checkbox, instead of sending one request which contains the state of 100 checkboxes. – Tomalak May 12 '17 at 10:50
  • @Tomalak I’m not missing that, actually. The extra deferred object here is just a product of the particular variation that I’ve currently got implemented. I’ve tried others that just returned the AJAX promise directly, which made no difference. I explain why I don’t want to send just one request with the state of all checkboxes: unless I’m missing something, doing so would make it impossible for me to give a visual cue on each individual row in the form on the page, which is essential. – Janus Bahs Jacquet May 12 '17 at 10:54
  • You send 100 requests in close succession that are not different enough in their payload or purpose to justify 100 separate requests. To transfer 100 items, you can send one request with an array of 100 items. There is more than one way to make a "too many requests" error go away, and sending the requests slower is not the most... scalable approach. – Tomalak May 12 '17 at 10:58
  • @Tomalak I agree with that—and if I could figure out a way to send everything in one request, but get individual success/error results back as they become available, that is obviously what I’d do. But I’ve failed quite spectacularly at doing so thus far. :-/ – Janus Bahs Jacquet May 12 '17 at 11:07
  • See my answer (which contains, at least at the moment, no code). This is a problem that needs to be solved beyond the implementation level. There are ways to establish a communication channel between client and server (Websockets come to mind, but also long-poll requests), but initially I would concentrate on the server (since all the client does is wait). – Tomalak May 12 '17 at 11:20

4 Answers


So it sounds like you want to limit the number of concurrent outstanding requests.

Before answering that, some thoughts regarding other things to do instead:

  1. I suspect you'd be better served by "chunking" groups of checkboxes together, so you have (say) 10 or 20 requests rather than 100, and get back your results in blocks.

  2. It is possible to have a single request and then get back information incrementally from the server as that request is processed. It's been so long since I've had to do it that at the time an iframe was the best approach, but I'm sure things have moved on in the intervening ~17 years. Basically, the resource you're posting to writes each result to its response as that result becomes available, and flushes its output; the server sends that to the client without terminating the connection (because there's still more data to come), and the client can read that partial data and update. So it may be worth researching how to do that here in 2017, as opposed to with a dodgy iframe. :-)

  3. Another approach is to send a single request with all the data, and have its response be purely that the request has been received. Then have the code poll periodically to get the results available as they become available.
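A minimal sketch of the "chunking" idea from point 1 (the `chunk` helper and the request shape are illustrative, not part of the question's code):

```javascript
// Split an array into fixed-size groups, so that e.g. 100 checked boxes
// become 10 requests of 10 values each rather than 100 requests of 1.
function chunk(items, size) {
    var chunks = [];
    for (var i = 0; i < items.length; i += size) {
        chunks.push(items.slice(i, i + size));
    }
    return chunks;
}

// Each group would then go out as a single request, e.g.:
//   chunk(checkedIds, 10).forEach(function (ids) {
//       $.post($form.prop('action'), { ids: ids });
//       // ...and the response would report per-id success, so each
//       // row can still be marked individually.
//   });
```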

With that out of the way, back to your actual question: :-)

From a Promise perspective, you can use my solution in this answer:

const items = /* ...1000 items... */;
const concurrencyLimit = 10;
const promise = Promise.all(items.reduce((promises, item, index) => {
    // What chain do we add it to?
    const chainNum = index % concurrencyLimit;
    let chain = promises[chainNum];
    if (!chain) {
        // New chain
        chain = promises[chainNum] = Promise.resolve();
    }
    // Add it
    promises[chainNum] = chain.then(_ => foo(item));
    return promises;
}, []));

Adapting that to your situation, I get something along these lines:

$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);

    // Get a true array for only the checked checkboxes
    var checkboxes = $form.find('input.userCheckbox:checked').get();
    var concurrencyLimit = 10;
    Promise.all(checkboxes.reduce(function(promises, checkbox, index) {
        // What chain do we add it to?
        var chainNum = index % concurrencyLimit;
        var chain = promises[chainNum];
        if (!chain) {
            // New chain
            chain = promises[chainNum] = Promise.resolve(); // Or $.Deferred().resolve().promise() would work, I think
        }
        // Add it
        promises[chainNum] = chain.then(function() {
            return sendmail($(checkbox), $form);
        });
        return promises;
    }, [])).then(function() {
        // Entire process is done
    });
});

With these small changes to sendmail:

function sendmail($this, $form) {
    return $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { /* Stuff here */ },
        beforeSend: function() {
            console.log($this.val() + " start: " + Date.now());
            /* + Add loading class to parent */
        },
        success: function(data, textStatus, jqXHR) { /* Add success/fail class to parent */ },
        error: function(jqXHR, textStatus, errorThrown) { /* Add fail class to parent */ },
        complete: function() {
            console.log($this.val() + " end: " + Date.now());
        }
    });
}

That's assuming a fairly up-to-date version of jQuery, in which the various issues with $.Deferred have been cleaned up and brought in line with the Promises/A+ spec.

We can take $.Deferred out of the equation entirely if you prefer, via these changes to sendmail:

function sendmail($this, $form) {
    return new Promise(function(resolve, reject) {
        $.ajax({
            url: $form.prop('action'),
            type: $form.prop('method'),
            dataType: 'json',
            data: { /* Stuff here */ },
            beforeSend: function() {
                console.log($this.val() + " start: " + Date.now());
                /* + Add loading class to parent */
            },
            success: function(data, textStatus, jqXHR) {
                /* Add success/fail class to parent */
                resolve();
            },
            error: function(jqXHR, textStatus, errorThrown) {
                /* Add fail class to parent */
                reject();
            },
            complete: function() {
                console.log($this.val() + " end: " + Date.now());
            }
        });
    });
}
T.J. Crowder
  • “So it sounds like you want to limit the number of concurrent outstanding requests.” — I suppose I do. Really, being the lazy sod that I am, I want _it_ to limit and keep track of outstanding requests for me, but I suppose that’s asking a bit much. I’ll have to mull over this answer a bit (like I said, I’m quite new to promises, and I’ve only just gotten the hang of the jQuery approach—I’ve left off the pure JavaScript approach for now). – Janus Bahs Jacquet May 12 '17 at 11:01
  • I'm pretty sure calling `.promise();` on an jQuery Ajax object is redundant. It's a promise already. – Tomalak May 12 '17 at 11:01
  • @Tomalak: You're [quite right](http://api.jquery.com/jQuery.ajax/#jqXHR), fixed. (It's harmless, but pointless.) I'd mis-remembered that jqXHR was a combination of XMLHttpRequest+$.Deferred, but it's actually a combination of XMLHttpRequest+$.Promise. I tend to normalize away from `$.Deferred`/`$.Promise` as soon as I can and lose track of their details... – T.J. Crowder May 12 '17 at 11:04
  • @JanusBahsJacquet: The fundamental thing above is that we don't call `sendmail` 100 times up-front; we call it X times (10, in the above), and then chain further calls to it to the completion handlers of those X calls, and keep doing that until we're out of requests to make. – T.J. Crowder May 12 '17 at 11:06

An HTTP request is a slow thing. And it's a resource hog, on the server side and the client side, when any processing at all is involved. You want to make as few of them as possible, while making the server's response as fast as possible.

Throttling the requests is a method of last resort, not something to do casually. When one instance of your web app bogs down the server with too many requests, you have an architectural issue, not a processing issue.

You can't really fix bad architecture by changing the processing side. What your code currently does is:

  1. capture the submit event
  2. send out, say, 100 Ajax requests, one for each checked checkbox
  3. ???
  4. Profit

Clearly the second step is misguided, especially since the only justification seems to be "I want the user to see intermediary progress". That's a nice goal, but making them wait longer and hammering the server at the same time can't be the solution.

Making the 100 requests in sequence makes the user wait even longer and only solves the server-hammering problem. So that's an even worse solution.

My advice would be to send the form as it would be sent - in one request - and work on the server side to make the processing of that request as fast as you possibly can. I'm pretty sure there is room for optimization there.

In a second step you can think about how to make the client more interactive during the wait. But keep in mind - sending one request and letting the server process it is the shortest wait you can get. Anything that sends more requests than one will of course be slower overall due to the added overhead.

Tomalak
  • You’re absolutely right. The thing is more psychological than hard numbers, though: we are generally perfectly happy to wait much longer _if we can see there’s progress_. Step 3 above (taking place on the server) does take quite a while, because there’s a lot going on there, including API calls to an external provider for each e-mail generated. If there happen to be 2,000 checkboxes, the entire thing will likely take at least 15 minutes—which is fine, as long as the user can see how far they are. But not if nothing happens (that they can see) for 15 minutes. – Janus Bahs Jacquet May 12 '17 at 11:20
  • If there happen to be 2000 checkboxes, you have an UX issue as well. :) – Tomalak May 12 '17 at 11:31
  • That’s an intended (and useful!) part of the UX design. Different types of e-mail have different groups of recipients, and we need to be able to be able to selectively add and remove individual recipients even to the very large groups. Having a table with 2,000 rows of people is of course not the easiest thing in the world for a user to manage, but it is the best compromise between the needs we have. (This is for an admin control panel, not an end-user interface.) – Janus Bahs Jacquet May 12 '17 at 11:36
  • There are client side libraries that encapsulate websockets to such a degree that they transparently provide you with a fall-back if a websocket connection can't be made for some reason. You could use a pub-sub library to abstract the idea of async communication with out-of band updates even further. But the core point is that nothing can make the server faster, except actually making the server faster. The main part of your effort should go there. – Tomalak May 12 '17 at 12:47

Try this, but only with jQuery 3+, because of changes to callback execution:

$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);
    var promise = $.when();

    $form.find('input.userCheckbox').each(function(i) {
        var $this = $(this);

        if (this.checked) {
            promise = promise.then(function() {
                return sendmail($this, $form);
            }).then(function(data) {
                //success
            }, function() {
                // fail
            });
        }
    });
});

function sendmail($this, $form) {
    return $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { /* Stuff here */ },
        beforeSend: function() {
            console.log($this.val() + " start: " + Date.now());
            /* + Add loading class to parent */
        },
        complete: function() {
            console.log($this.val() + " end: " + Date.now());
        }
    });
}
Wizard

Better yet: send one Ajax request to the PHP handler and handle an array on the server side:

function sendMail(userCheckbox, $form)
{
    $.ajax({
        url: $form.prop('action'),
        type: $form.prop('method'),
        dataType: 'json',
        data: { array: userCheckbox /* and more data here */ },
        beforeSend: function() {
            console.log("start: " + Date.now());
            /* + Add loading class to parent */
        },
        success: function(response) {
            /* check response and do something like if (response == 'success') */
        },
        complete: function() {
            console.log("end: " + Date.now());
        }
    });
}
$(document).on('submit', "#sendform", function(e) {
    e.preventDefault();
    var $form = $(this);
    var userCheckbox = [];
    $form.find('input.userCheckbox:checked').each(function(i) {
        var $this = $(this);
        if ($this.val()) {
            userCheckbox.push($this.val());
        }
    });
    if (userCheckbox.length > 0) sendMail(userCheckbox, $form);
});
Pedro
  • That doesn’t actually answer the question at all—it’s exactly what I’m trying to avoid, since it gives me no option of displaying the status of each individual checkbox as it becomes available. – Janus Bahs Jacquet Jul 13 '17 at 09:43