11

The situation: I have a page that has to display an undetermined number of images, loaded one by one through AJAX (the server returns them base64-encoded).

var position = 'front';
while(GLOB_PROCEED_FETCH)
{
    getImageRequest(position);
}

function getImageRequest(position)
{
    GLOB_IMG_CURR++;
    $.ajax({
        url: urlAJAX + 'scan=' + position,
        method: 'GET',
        async: false,
        success: function(data) {
            if ((data.status == 'empty') || (GLOB_IMG_CURR > GLOB_IMG_MAX))
            {
                GLOB_PROCEED_FETCH = false;
                return true;
            }
            else if (data.status == 'success')
            {
                renderImageData(data);
            }
        }
    });
}

The problem is that the images (constructed by the renderImageData() function) are appended to the target DIV all at once, only after every image has been fetched. In other words, no DOM updates are rendered until the loop is over.

I need to load and display the images one by one because there may be a huge number of them, so I can't hold them back until they have all been fetched.
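For reference, renderImageData() is not shown here; a minimal sketch of the idea, assuming the response carries the base64 payload in data.image and its MIME type in data.mime, and that the images go into a DIV with id #image-container (all three names are placeholders, not the actual code):

function renderImageData(data) {
    // Build a data: URI from the base64 payload and append
    // the resulting <img> to the container DIV.
    var img = $('<img>', {
        src: 'data:' + data.mime + ';base64,' + data.image
    });
    $('#image-container').append(img);
}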

Aleksandr
  • You're starting a request in the while loop, and that request starts another request if `data.status == 'success'` and no request when you set `GLOB_PROCEED_FETCH = false`, which is the condition of the while loop. So what is the reason for the while loop at all? – Andreas Oct 12 '13 at 08:24
  • Fixed the initial code – Aleksandr Oct 12 '13 at 08:48

3 Answers

33

Your best bet would be to restructure your code to use async ajax calls and launch the next call when the previous one completes, and so on. This will allow the page to repaint between image fetches.

This will also give the browser a chance to breathe and take care of its other housekeeping, and keeps it from looking like it's locked up or hung.

Also, using `async: false` is a bad idea. There is no reason properly structured code can't use asynchronous ajax calls here without hanging the browser while the data is fetched.

You could do it with asynchronous ajax like this:

function getAllImages(position, maxImages) {
    var imgCount = 0;

    function getNextImage() {
        $.ajax({
            url: urlAJAX + 'scan=' + position,
            method: 'GET',
            async: true,
            success: function(data) {
                // Render this image, then launch the next request.
                // Stop when the server has no more images or the cap
                // is reached (imgCount < maxImages, not <=, so exactly
                // maxImages images are fetched).
                if (data.status == "success" && imgCount < maxImages) {
                    ++imgCount;
                    renderImageData(data);
                    getNextImage();
                }
            }
        });
    }
    getNextImage();
}

// no while loop is needed
// just call getAllImages() and pass it the 
// position and the maxImages you want to retrieve
getAllImages('front', 20);

Also, while this may look like recursion, it isn't technically recursion because of the async nature of the ajax call: each invocation of getNextImage() has already returned before the success handler calls the next one, so no stack frames build up.
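For comparison, the same sequential pattern can be written against the promise interface that `$.ajax` already returns. This is only a sketch, assuming jQuery 1.8+ (where `.then()` chains a promise returned from its callback) and the same `renderImageData()` from the question:

function getAllImages(position, maxImages) {
    var imgCount = 0;

    function getNextImage() {
        return $.ajax({ url: urlAJAX + 'scan=' + position, method: 'GET' })
            .then(function(data) {
                if (data.status == "success" && imgCount < maxImages) {
                    ++imgCount;
                    renderImageData(data);
                    return getNextImage(); // chain the next request
                }
            });
    }
    return getNextImage();
}

// the returned promise resolves when the whole sequence is done
getAllImages('front', 20).done(function() { /* all images loaded */ });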

jfriend00
  • while(GLOB_PROCEED_FETCH) { setTimeout(getImageRequest(position), 1); } - you mean to do the requests in a loop without a recursive call in the "success" handler? – Aleksandr Oct 12 '13 at 08:20
  • @AlexShumilov - I added a code example using async AJAX which is much better. – jfriend00 Oct 12 '13 at 08:24
  • I cannot set "async: true", because the number of AJAX calls needed is undetermined - if I set "async: true", it produces a huge number of AJAX calls at the start, while GLOB_PROCEED_FETCH is not yet set. – Aleksandr Oct 12 '13 at 08:24
  • @AlexShumilov - please look at my code example. It doesn't use the while loop. It launches the next ajax call when the previous one succeeds and stops when it hits the max images or doesn't get success. It can work with `async: true`. – jfriend00 Oct 12 '13 at 08:26
  • Thank you very much! There is still a problem - we cannot set the limit of calls - GLOB_IMG_MAX is an extreme value that will hardly ever be reached. And if we rely on the flag GLOB_PROCEED_FETCH, it won't work properly - by the time the server responds "No images left", a lot of AJAX calls will already have been made. – Aleksandr Oct 12 '13 at 08:35
  • @AlexShumilov did you check Optimus' answer? He also did something for you –  Oct 12 '13 at 08:37
  • someone please explain why this doesn't amount to a memory leak. Doesn't the first callback stay in memory while the second executes, then those two for the third and so on? Maybe there's something I'm not getting about how callbacks are referenced... – robisrob Nov 05 '15 at 21:51
  • @robisrob - Each function call to `getNextImage()` creates an execution context for that function. That execution context will stay in memory as long as anything is running that can reach the execution context. That means it will stay in memory until the `success` handler runs on the particular ajax call that was started inside that execution context. But, once the ajax call finishes, there is nothing left that has any references to that execution context so it will be eligible for garbage collection and thus is not a leak. – jfriend00 Nov 05 '15 at 22:10
  • @robisrob - Also, relevant to this issue is that `getNextImage()` runs, starts the ajax call and finishes execution all in one quick moment. The ajax `success` handler is called some time LATER, after that invocation of `getNextImage()` has already completed. Thus, when the next invocation of `getNextImage()` is called from the `success` handler, there is no stack frame build-up, as the stack has already unwound from the prior call to `getNextImage()` finishing execution - this is not really classic recursion. So, there is no memory leak or build-up with this construct. – jfriend00 Nov 05 '15 at 22:12
  • thanks, I think I get it now - each called-back instance of `getNextImage()` runs to completion as soon as it's invoked, at which point its reference is no longer needed, so there's no recursion stacking up here – robisrob Nov 05 '15 at 23:38
-2

Wrong and wrong. Don't use timers, don't chain them. Look at jQuery Deferred / when; it has everything you need.

var imgara = [];
imglist.forEach(function(image) {
  // collect the jqXHR promise returned by each ajax call
  // (URL shape borrowed from the question)
  imgara.push($.ajax({ url: urlAJAX + 'scan=' + image, method: 'GET' }));
});
$.when.apply($, imgara).done(function() {
  // do something
}).fail(function() {
  // do something else
});
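That said, if each image should be rendered as soon as it arrives, a .done() handler can be attached to each individual jqXHR before aggregating, since it fires as soon as that particular request completes. A sketch, assuming imglist holds the positions to fetch (it is not defined in the original) - note this still launches all requests at once and does not guarantee DOM insertion order:

var requests = imglist.map(function(image) {
  return $.ajax({ url: urlAJAX + 'scan=' + image, method: 'GET' })
    .done(renderImageData); // fires per image, as soon as it arrives
});
$.when.apply($, requests).done(function() {
  // everything has been fetched and rendered at this point
});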
Chris Caviness
  • If you read the OP's question carefully, this is NOT how to solve it. They say: **I need to load and display the images one by one because there may be a huge number of them, so I can't hold them back until they have all been fetched.** The OP also wants to append the images, in order, as they are fetched (not waiting until everything is done and appending them in order). Your suggestion does not meet any of these requirements. – jfriend00 Nov 05 '15 at 23:54
  • Actually, if you learn how to use deferreds properly, you can do exactly what the OP described and handle each image as it is retrieved. I'm thinking you must be a timer writer, hoping that the estimated .5 seconds per image will never fail. The code above was not an answer, because I don't answer "givz me the codez" questions; it was a sample to point him in the proper direction. – Chris Caviness Jun 14 '16 at 18:06
  • Chris is the problem with this site. – Alex Mar 21 '17 at 03:58
-3

Try using the setInterval() function instead of the while loop.

var fetch = setInterval(loadImage, 2000);

function loadImage() {
    // update the position variable here as needed
    getImageRequest(position);
    // stop polling once the fetch flag has been cleared
    if (!GLOB_PROCEED_FETCH) {
        clearInterval(fetch);
    }
}
Optimus Prime
  • How does this ever stop? It will go on forever. It doesn't observe any of the stop conditions the OP's code has. Also, if you use async ajax (which doesn't lock up the browser), there is no reason to use a timer either. – jfriend00 Oct 12 '13 at 08:40
  • @jfriend00 Nice observation. I'm using clearInterval now, as added above. Thank you. – Optimus Prime Oct 12 '13 at 08:50
  • Thank you! It works, but somehow it still renders the images strangely - it renders them in sets of two or three. I mean, at the very beginning ALL of the images were rendered at the very end; now they are rendered in sets of random size. – Aleksandr Oct 12 '13 at 09:02
  • Increasing your interval may give your browser more time to breathe. But I don't think rendering them in sets of random size would be any problem - it's fine, right? – Optimus Prime Oct 12 '13 at 09:05
  • Thank you again! Actually yeah, as long as it doesn't cause any problem with the logic – Aleksandr Oct 12 '13 at 09:14