
I'm not sure if this will actually be possible, since load() is an asynchronous method, but I need some way to load several small pieces of pages, one at a time, extract some data from them via JavaScript, and then send that data over via Ajax so I can put it into a database I made.

Basically I get this from my page, where all the links I'll have to iterate through are located:

var digiList = $('.2u');
var link;
for (var i = 0; i < digiList.length; i++) {

    link = "http://www.digimon-heroes.com" + digiList.eq(i).find('map').children().attr('href');

So far so good.

Now, I'm going to have to load each link (only a specific div of the full page, not the whole thing) into a div I have somewhere on my page, so that I can get some data via jQuery:

    var contentURI = link + ' div.row:nth-child(2)';

    $('#single').load('grabber.php?url=' + contentURI, function() {
        ///////////// And I do a bunch of jQuery stuff here, and save stuff into an object
        ///////////// Aaaand then I call up an ajax request.
        $.ajax({
            url: 'insertDigi.php',
            type: 'POST',
            data: {digimon: JSON.stringify(digimon)},
            dataType: 'json',
            success: function(msg) {
                console.log(msg);
            }
            //////// This calls up a script that handles everything and makes an insert into my database.
        }); // END ajax
    }); // END load callback function
} // END 'for' statement

alert('Inserted!');

Naturally, as would be expected, the loading takes too long, and the rest of the for statement just keeps going, not caring about letting the load finish its business, since the load is asynchronous. The alert('Inserted!') is called before I even get the chance to load the very first page, which means the loop is long over before I can process the loaded content and send it over to my script.
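To illustrate, a stripped-down sketch of the ordering (with made-up URLs) shows the callbacks only running after the loop and the alert have already finished:

// Simplified sketch of the ordering problem (URLs are made up):
for (var i = 0; i < 3; i++) {
    console.log('scheduling load ' + i);            // runs immediately, 3 times
    $('#single').load('grabber.php?url=page' + i, function() {
        console.log('load finished');               // runs later, after the alert
    });
}
alert('Inserted!');                                 // fires before any callback above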

So my question is: Is there some creative way to do this in such a manner that I could iterate through multiple links, load them, do my business with them, and be done with it? And if not, is there a synchronous alternative to load, that could produce roughly the same effect? I know that it would probably block up my page completely, but I'd be fine with it, since the page does not require any input from me.

Hopefully I explained everything with the necessary detail, and hopefully you guys can help me out with this. Thanks!

Miguel Guerreiro
  • No. You forgot to give the HTML. – Praveen Kumar Purushothaman May 09 '16 at 19:16
  • I don't think that would give context to my particular problem, in this case. I have separately tested everything on my script and it's all working fine. I can get the data and I can get it into my database. My problem isn't with the page itself; it's just getting the 'for' statement to wait for my .load() callback function to end before continuing and incrementing its iterator variable. – Miguel Guerreiro May 09 '16 at 19:19
  • Do you actually want them to run one after another? Or can they all run asynchronously together? – wirey00 May 09 '16 at 19:27
  • I think you're looking for something like this? http://stackoverflow.com/a/21819961/1385672 – wirey00 May 09 '16 at 19:32
  • They could definitely run all at the same time for all I care. My problem would be that they need to be loaded into my page first, so that I can access the data I need via JavaScript. That kinda limits me to doing it one at a time (Load page -> get data -> send data -> Load another page -> get data -> send data -> ...), I think? If there's an alternative to it then I'm all ears. – Miguel Guerreiro May 09 '16 at 19:35
  • @ᾠῗᵲᄐᶌ That might be it! I'm still a little confused about how it works and how to work that solution into my code, but I'll re-read that post a little more and try it in my script. Thanks! – Miguel Guerreiro May 09 '16 at 19:40
  • I'm about to head home - I'll put up an example when I get home if you can't figure it out. – wirey00 May 09 '16 at 19:41
  • Ajax, and as such `load`, gets the entire file; any parsing and selecting of data has to be done on the client side. You can, however, create server-side scripts that only partially read files and return the content, etc. – adeneo May 09 '16 at 19:42

3 Answers


You probably want a recursive function that waits for one iteration to finish before going on to the next one, etc.:

(function recursive(i) {
    var digiList = $('.2u');
    if (i >= digiList.length) return; // stop once every link has been handled
    var link = digiList.eq(i).find('map').children().attr('href') + ' div.row:nth-child(2)';
    $.ajax({
        url: 'grabber.php',
        data: {
            url: link
        }
    }).done(function(data) {

        // do stuff with "data"

        $.ajax({
            url: 'insertDigi.php',
            type: 'POST',
            data: {
                digimon: digimon
            },
            dataType: 'json'
        }).done(function(msg) {
            console.log(msg);
            recursive(i + 1); // do the next one ... when this one is done
        });
    });
})(0);
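If you'd rather not use explicit recursion, the same one-at-a-time sequencing can also be expressed by chaining deferreds, something like this (a sketch, assuming jQuery 1.8+; processData is a made-up placeholder for whatever builds the digimon object):

// Sketch: chain the requests sequentially by reducing over the list of links.
var links = $('.2u').map(function() {
    return $(this).find('map').children().attr('href') + ' div.row:nth-child(2)';
}).get();

links.reduce(function(chain, link) {
    return chain.then(function() {
        return $.ajax({ url: 'grabber.php', data: { url: link } })
            .then(function(data) {
                var digimon = processData(data); // placeholder for the jQuery work above
                return $.ajax({
                    url: 'insertDigi.php',
                    type: 'POST',
                    data: { digimon: JSON.stringify(digimon) },
                    dataType: 'json'
                });
            });
    });
}, $.Deferred().resolve().promise());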
adeneo
  • After some minor tweaking, that worked like an absolute charm! And it prompted me to look up what a recursive function is. Thank you so much! – Miguel Guerreiro May 09 '16 at 20:33

Just in case you want them all to run together, you can use a closure to preserve each number in the loop:

for (var i = 0; i < digiList.length; i++) {
    (function(num) { // num here as the argument is actually i
        var link = "http://www.digimon-heroes.com" + $(digiList).eq(num).find('map').children().attr('href');
        var contentURI= link + ' div.row:nth-child(2)';   
        $('#single').load('grabber.php?url=' + contentURI, function() {
            ///////////// And I do a bunch of JQuery stuff here, and save stuff into an object 
            ///////////// Aaaand then I call up an ajax request.
            $.ajax({
                url: 'insertDigi.php',
                type: 'POST',
                data: {
                    digimon: JSON.stringify(digimon)
                },
                dataType: 'json',
                success: function(msg) {
                    console.log(msg);
                }
                //////// This calls up a script that handles everything and makes an insert into my database.
            }); //END ajax
        }); //END load callback Function
    })(i);// <-- pass in the number from the loop
}
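As a side note, if ES2015 is an option, `let` gives each iteration its own binding, so the wrapper function isn't needed at all (a rough sketch, using the same digiList as in the question):

// With ES2015 block scoping each iteration keeps its own "i",
// so the IIFE wrapper above is no longer necessary:
for (let i = 0; i < digiList.length; i++) {
    const link = "http://www.digimon-heroes.com" + digiList.eq(i).find('map').children().attr('href');
    $('#single').load('grabber.php?url=' + link + ' div.row:nth-child(2)', function() {
        // same jQuery work and $.ajax POST as above
    });
}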
wirey00
  • Unfortunately, as they all use .load() on a particular div, I can't run them all at once. It has to load one, get the data, and move to another one, rinse and repeat. Loading all of them at once would not allow me to get the data I need from the loaded page c: Either way, this was already resolved and the project I was making with it is complete. Thank you for your suggestion though, and for following through with posting the code once you got to your PC, appreciate it! Although I can't use it for this project, this is definitely something I could see myself using soon o: – Miguel Guerreiro May 17 '16 at 11:02
0

You can always use synchronous ajax, but there's no good reason for it.

If you know the number of documents you need to download (you can count them, or just hardcode it if it's constant), you could run a callback function on each success, and once everything is done, proceed with the logic that needs all the documents.

To make it even better, you could trigger an event (on document or any other object) when everything is downloaded (e.g. "downloads_done") and listen for this event to do whatever you need to do.
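A rough sketch of that counter-and-event idea (the URL list and handler bodies are made up):

// Count finished downloads and fire a custom event once all have arrived.
var urls = ['page1.html', 'page2.html', 'page3.html']; // made-up list
var finished = 0;

urls.forEach(function(url) {
    $.get(url, function(data) {
        // ... collect whatever you need from "data" ...
        finished++;
        if (finished === urls.length) {
            $(document).trigger('downloads_done');
        }
    });
});

$(document).on('downloads_done', function() {
    // everything is downloaded, so run the logic that needs all the documents
});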

But all of the above is for the case where you need to do something once everything is done. However, I'm not sure I understood your question correctly (just read it again).

If you want to download something -> do something with the data -> download another thing -> do something again...

Then you can also use a JavaScript waterfall (a library, or build your own) to keep it simple and easy to use. In a waterfall you define what should happen when each async function is done, one step at a time.
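In its simplest form such a waterfall is just a list of steps where each step calls the next one when its async work finishes; a hand-rolled sketch might look like this:

// Minimal hand-rolled "waterfall": every step gets a "next" callback
// and calls it when its own async work has finished.
function waterfall(steps) {
    (function run(i) {
        if (i >= steps.length) return;
        steps[i](function next() {
            run(i + 1);
        });
    })(0);
}

waterfall([
    function(next) { $.get('page1.html', function(data) { /* use data */ next(); }); },
    function(next) { $.get('page2.html', function(data) { /* use data */ next(); }); }
]);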

sznowicki