
I wish to make a synchronous call to a page that handles an SQL insert of the words I'm posting. However, since I have many chunks and the SQL handler is not asynchronous, I wish to process each AJAX call/chunk one after another:

for (chunk = 1; chunk <= totalchunks; chunk++) {
    $.ajax({
        type: "POST",
        dataType: "json",
        url: "updateHandle.php",
        data: { words: arr.slice(1000 * (chunk - 1), 1000 * chunk), push: getpush },
        success: function () {
            console.log('Items added');
        },
        error: function () {
            console.log('Errors happened');
        }
    });
}

The `async: false` option does not work for some reason: each AJAX call always goes to the error callback and never the success callback. So is there another solution to this issue that I've overlooked?

I thought about using a busy-waiting while loop with locks, but the setTimeout() function does not work as expected (probably some error on my part).

EDIT: The chunks are too big for a single AJAX call, hence the serialization is needed. Also, the number of chunks may change from call to call, so flexibility is needed.

deFunc
  • What about XMLHttpRequest API? – Vitaliy Vinnychenko Jun 26 '17 at 10:21
  • @VitaliyVinnychenko huh? $.ajax is just a jQuery wrapper around XMLHttpRequest. Is there something else you're suggesting could be done using native XHR instead of jQuery? – ADyson Jun 26 '17 at 10:22
  • "Each ajax call always goes to the error case and not the success case." — So debug it? Pay attention to the arguments passed to the error function. Look at the Console for errors. Look at the Network tab to see what the request and response actually look like. – Quentin Jun 26 '17 at 10:22
  • @Igor — That doesn't seem remotely relevant to the problem described here. – Quentin Jun 26 '17 at 10:23
  • Synchronous AJAX requests are deprecated. Take a look at [this](https://stackoverflow.com/questions/29282869/synchronous-xhr-deprecation). – Vitaliy Vinnychenko Jun 26 '17 at 10:23
  • "Each ajax call always goes to the error case and not the success case." That doesn't necessarily relate to the `async:false`, although using that is a bad idea - it's deprecated in some browsers and causes horrible user experience. You can chain $.ajax calls though to ensure they execute in sequence, because $.ajax returns a promise. e.g. `$.ajax(...).then(function(response) { //...trigger the next call here });`. You can look up more about promises/deferred objects online. But going to the "error" callback could be a symptom of some other failure. Like Question said check for console errors – ADyson Jun 26 '17 at 10:25
  • @Quentin - how so? Although the OP does not understand why their code does not work, the solution is not to make the JavaScript call synchronous, and the only reason they might be thinking that is because they are not sure why it's not working, or because they do not understand how to make consecutive, ordered asynchronous calls. – Igor Jun 26 '17 at 10:26
  • As you've discovered, using setTimeout won't work because you can't guarantee how long the AJAX request will take to respond; it could easily exceed the timeout due to a slow network etc. Also, one more thing - consider whether the size of your request is really so huge that it needs chunking up. Couldn't you just submit it all at once? – ADyson Jun 26 '17 at 10:27
  • @Igor — Changing the code to make sequential requests in a better way isn't going to help a problem where the requests themselves are erroring. – Quentin Jun 26 '17 at 10:27
  • This is a bad approach and will bring you headaches when one of the requests fails due to a failure at the server. If one fails, your loop keeps going, so the next chunk is sent when it shouldn't be... You have to redesign it to use async AJAX and collect all chunks first *before* inserting them into the database. – KarelG Jun 26 '17 at 10:28
  • @Quentin - maybe; the only way to know is to actually debug the code, but the request(s) *probably* need to be sent in a specific order. The code above would fire them one after the other, but there is no waiting and no guarantee that they would be handled by the server in the same sequence they were sent, which *might* cause the error. – Igor Jun 26 '17 at 10:29
  • I will probably tinker with the $.ajax(...).then(function...); suggestion from ADyson, didn't really think about that. Thanks for all the answers! :) – deFunc Jun 26 '17 at 10:30
  • @Igor — They said they were using `async: false` so it would be in order. – Quentin Jun 26 '17 at 10:31
  • @Quentin - `async: false does not work for some reason...` - I read that as "the setting has no effect", not so much as "the async flag works but the issue persists". – Igor Jun 26 '17 at 10:35
  • @Igor — There's no reason for it not to work. While browsers have deprecated it, they still support it because they have to be backwards compatible. The issue must be unrelated to the ordering. – Quentin Jun 26 '17 at 10:38

1 Answer


How about something like:

function insertWords(words) {
    if (words.length) {
        var chunk = words.splice(0, 1000); // take up to 1000 words off the front
        $.ajax({
            type: "POST",
            dataType: "json",
            url: "updateHandle.php",
            data: {
                words: chunk,
                push: getpush
            },
            success: function () {
                console.log(chunk.length + ' items added');
                insertWords(words); // send the next chunk only after this one succeeds
            },
            error: function (jqXHR, textStatus) {
                console.log('Errors happened: ' + textStatus); // stop; remaining chunks are not sent
            }
        });
    } else {
        console.log("Done inserting all words.");
    }
}

You can pass a callback as a parameter of insertWords, or return a promise, to make it even more resilient.
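As a rough sketch of that promise-based variant (the names here are illustrative: `postChunk` is a hypothetical stand-in for the real `$.ajax` POST to `updateHandle.php`, and `insertWordsSequentially` mirrors the recursive function above):

```javascript
// postChunk is a hypothetical stand-in for the real $.ajax POST to
// updateHandle.php; here it just resolves with the number of words "inserted".
function postChunk(words) {
    return new Promise(function (resolve) {
        setTimeout(function () { resolve(words.length); }, 0);
    });
}

// Sends one chunk at a time; the next chunk is only posted after the
// previous request resolves. Resolves with the total number of words sent.
function insertWordsSequentially(words, chunkSize) {
    if (words.length === 0) {
        return Promise.resolve(0);
    }
    var chunk = words.splice(0, chunkSize);
    return postChunk(chunk).then(function (inserted) {
        return insertWordsSequentially(words, chunkSize).then(function (rest) {
            return inserted + rest;
        });
    });
}
```

With jQuery you could return `$.ajax(...)` directly instead of `postChunk`, since `$.ajax` returns a thenable; a failed request then rejects the chain, so no further chunks are sent, which also addresses the concern raised in the comments about chunks still being fired after a server-side failure.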

AndreiC