I have a script built specifically for spotty internet connections which allows a form to be submitted one or more times by converting the form data to JSON (using JSON.stringify) and moving its contents into a hidden <textarea> field on submission. For now, let's call this hidden field the dataCache.

When the form submits and its data is moved into the dataCache, a function named processDataCache() is immediately called, which tries to process/save the data with a jQuery $.ajax() call. If the call succeeds, the dataCache is cleared. If it fails, for example because there is no internet connection, setInterval is used to retry processDataCache() every few seconds.

In the meantime, if the form is submitted again, its contents are appended to the dataCache, separated from any existing contents by a unique delimiter.

Eventually, once an internet connection becomes available again, processDataCache() completes successfully and the dataCache is cleared.
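
To make the principle concrete before the full code, here is a minimal sketch of the idea (simplified: no batching, setTimeout instead of the setInterval my real code uses, and an illustrative #dataCache id; in the real code below the function is named processEventCache()):

<textarea id="dataCache" style="display:none;"></textarea>

<script>
var cacheDelim = "##--##"; // unique delimiter between queued submissions

function saveFormToCache(){
    var nNewData = JSON.stringify($("#MyForm").serializeObject()); // serializeObject() comes from a plugin
    var nCache = $("#dataCache").val();
    // append to anything already queued, delimiter-separated
    $("#dataCache").val(nCache ? nCache + cacheDelim + nNewData : nNewData);
    processDataCache();
    return false; // suppress the normal form submission
}

function processDataCache(){
    var nCache = $("#dataCache").val();
    if(!nCache){ return; }
    $.ajax({
        type: "POST", url: "process_cache.php",
        data: { nEventCache: nCache },
        success: function(){ $("#dataCache").val(""); }, // clear the cache once saved
        error: function(){ setTimeout(processDataCache, 5000); } // offline? retry in 5 seconds
    });
}
</script>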

This works fine when online, with one or multiple items in the dataCache. BUT my current issue is that when the form goes offline and one or more items get added to the dataCache, $.ajax() almost always seems to fire twice when coming back online. I say almost because it happens about 75% of the time, but not every time. Interestingly enough, if I add a console.log to the success callback of $.ajax(), it only logs once, as if the call were made just once, but the data gets processed twice regardless.

Following is the code that is having the issue. It also includes batch processing to work around POST size limits, which makes it a bit more complicated than outlined above, but the underlying principle is the same:

<textarea id="dataLogCacheFile" style="display:none;"></textarea>

<script>
var maxEventsForBatchProcess = 5; // set to highest amount POST will allow cross-browser, considering each event will likely be around 400 chars (e.g. 5 for ~2000 char limit)
var nRetryAddEventTime = 5000; // every 5 seconds - if batch fails (e.g. no connection), how long to wait until trying again
var nAddNextEventsTime = 1000; // every 1 second - once a batch completes, how long to wait before the next batch begins
var cacheDelim = "##--##"; // delimiter for events in cache, must be unique and be the same as what the nEventProcessURL expects
var processingCache = false;
var nEventProcessURL = "process_cache.php";
var eventProcessRepeater;

function saveEventToCache(xProcess){
    var nNewDataARRAY = $("#MyForm").serializeObject(); // pull in form data (serializeObject() is a plugin, not core jQuery)
    var nNewData = JSON.stringify(nNewDataARRAY);
    var nDataLogCache = $("#dataLogCacheFile").val(); // pull in existing cache
    var nDataLogCacheARRAY = [];
    if(nDataLogCache){ nDataLogCacheARRAY = nDataLogCache.split(cacheDelim); }
    if(nNewData){ nDataLogCacheARRAY.push(nNewData); } // add new event to cache
    // update cache field
    nDataLogCache = nDataLogCacheARRAY.join(cacheDelim);
    $("#dataLogCacheFile").val(nDataLogCache);
    if(xProcess){ processEventCache(); } // try to process it
    return false;
}

var nRetryCount = 0;
function processEventCache(){
    var nDataLogCache = $("#dataLogCacheFile").val(); // pull in existing cache
    if( nDataLogCache ){
        processingCache = true;
        var nDataLogCacheARRAY = nDataLogCache.split(cacheDelim);
        var nEventsToProcessARRAY = [], nEventsToCacheARRAY = [];
        for(var i = 0; i < nDataLogCacheARRAY.length; i++) {
            if(i < maxEventsForBatchProcess){
                nEventsToProcessARRAY.push( nDataLogCacheARRAY[i] );
            } else {
                nEventsToCacheARRAY.push( nDataLogCacheARRAY[i] );
            }
        }
        if(nEventsToProcessARRAY.length){
            var nEventsToProcess = nEventsToProcessARRAY.join(cacheDelim);
        var cacheProcessAjaxRequest = $.ajax({
            type: "POST", url: nEventProcessURL,
            data: { nEventCache: nEventsToProcess }, // pass an object so jQuery URL-encodes the JSON payload
            error: function(jqXHR, textStatus, errorThrown){
                    cacheProcessAjaxRequest.abort();
                    nRetryCount++;
                    if(!eventProcessRepeater){
                        eventProcessRepeater = setInterval(processEventCache,nRetryAddEventTime);
                    }
                },
                success: function(nCurrDateTime) {
                    // update cache field
                    nDataLogCache = nEventsToCacheARRAY.join(cacheDelim);
                    $("#dataLogCacheFile").val(nDataLogCache);
                    // if there are more events to process, keep going, otherwise close up shop
                    if(nDataLogCache){
                        setTimeout(function(){ processEventCache(); },nAddNextEventsTime); // wait a bit before processing next batch of events
                    } else {
                        processingCache = false;
                    }
                }
            });
        } else {
            processingCache = false;
        }
        // Turn off any additional processing threads if they're no longer needed
        if(!processingCache && eventProcessRepeater){ clearInterval(eventProcessRepeater); eventProcessRepeater = ""; }
    } else {
        processingCache = false;
    }
}
</script>
Michael

1 Answer

Well, I found a way to solve the issue without actually figuring out what was going on, but oh well, sometimes you just need to move forward.

Following the selected answer on this thread, I updated my code above to this (showing just the relevant lines):

var cacheProcessAjaxRequest = { abort:function(){} };
function processEventCache(){
    ...
    cacheProcessAjaxRequest.abort(); // abort any other pending processing
    cacheProcessAjaxRequest = $.ajax({
        ...
    });
    ...
}

I also removed the previous cacheProcessAjaxRequest.abort(); line from under the error: parameter. (The { abort:function(){} } dummy exists so the very first call to .abort() doesn't throw before any real request has been made.)
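
With that removed, the error handler simply reads:

            error: function(jqXHR, textStatus, errorThrown){
                nRetryCount++;
                if(!eventProcessRepeater){
                    eventProcessRepeater = setInterval(processEventCache, nRetryAddEventTime);
                }
            },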

With this fix in place, the double-processing still happened, but far less often. To take things one step further, I put in a variable to track how many cache-specific processes were running and only allowed a call to $.ajax() if that number was zero. These two together seem to have solved the issue completely; after about 50 tests, I didn't run into any more double-processing.
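
The counter itself isn't shown above, so here is a rough sketch of how the two fixes combine (simplified: activeCacheProcesses is just an illustrative name, and the batching logic is left out):

var activeCacheProcesses = 0; // illustrative name: number of in-flight cache requests
var cacheProcessAjaxRequest = { abort:function(){} };

function processEventCache(){
    if(activeCacheProcesses !== 0){ return; } // a cache request is already in flight
    var nDataLogCache = $("#dataLogCacheFile").val();
    if(!nDataLogCache){ processingCache = false; return; }
    processingCache = true;
    cacheProcessAjaxRequest.abort(); // abort any other pending processing
    activeCacheProcesses++;
    cacheProcessAjaxRequest = $.ajax({
        type: "POST", url: nEventProcessURL,
        data: { nEventCache: nDataLogCache }, // batching elided for brevity
        complete: function(){ activeCacheProcesses--; }, // runs after success or error
        error: function(){
            if(!eventProcessRepeater){
                eventProcessRepeater = setInterval(processEventCache, nRetryAddEventTime);
            }
        },
        success: function(){
            $("#dataLogCacheFile").val(""); // no batching here, so clear everything
            processingCache = false;
            if(eventProcessRepeater){ clearInterval(eventProcessRepeater); eventProcessRepeater = ""; }
        }
    });
}

Because of the early return, abort() can only ever touch an already-finished request (or the initial dummy), so complete can't fire a second time and drive the counter negative.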

Michael