
I am running a single webpage 24/7 on two computers, one running IE8 and the other IE9. On this webpage is a JavaScript timer which runs $.getJSON to retrieve a cross-domain JSON object. The $.getJSON call works perfectly under normal conditions; however, there is the possibility that the internet connection on one of these computers will go down temporarily. Since I am using $.getJSON to retrieve new content for the webpage, if the internet goes down momentarily, the old content will be shown.

My issue is that I assumed $.getJSON's fail event would fire if the internet was down when $.getJSON was called. In that case, a new timer would be set to attempt to retrieve the JSON in X minutes (it would never stop trying). When testing this, I disabled my internet connection, and yet the code in my fail handler never ran.

Will fail not be called in the case of no internet connection? If so, what would you recommend to prevent my JavaScript from stopping permanently when the internet goes down?

(One method I looked at was checking for an internet connection before the JSON call; however, I've read that window.navigator.onLine is unreliable, and I couldn't find any other solutions.)

noahnu

1 Answer


`getJSON` is just shorthand for `$.ajax` with a few options preset, and `$.ajax` has a `timeout` option. That is probably the most reliable approach (though it obviously won't respond to a network failure immediately). I would personally combine it with any checks you can find for determining network status.

$.ajax({
    dataType: "json",
    url: url,
    data: data,
    success: success,
    timeout: 10000 // 10 seconds
});
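
Building on that, here is a minimal sketch of wiring the timeout together with an error handler that retries, which is what the question is after. It assumes the same `url`, `data`, and `success` placeholders as the snippet above; `fetchContent` and the 5-minute retry interval are illustrative, not from the original code:

function fetchContent() {
    $.ajax({
        dataType: "json",
        url: url,
        data: data,
        success: success,
        timeout: 10000, // give up on this attempt after 10 seconds
        error: function (jqXHR, textStatus) {
            // Fires on timeouts as well as other network failures;
            // schedule another attempt instead of stopping permanently.
            setTimeout(fetchContent, 5 * 60 * 1000); // retry in 5 minutes
        }
    });
}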

http://api.jquery.com/jQuery.getJSON/

Dave
  • Actually, in your specific case, just set the timeout to whatever retry interval you want, and have the error handler retry the request (if the error was a timeout) directly, or with a `setTimeout('tryAgainFunction()', 0)` (not sure how jQuery works internally, but without that you might get an ever-increasing memory footprint). – Dave Apr 13 '13 at 20:51
  • Why put `tryAgainFunction()` in a timeout? – noahnu Apr 13 '13 at 20:58
  • On the `jQuery` website it says this under the `error` function for `$.ajax`: "This handler is not called for cross-domain script and cross-domain JSONP requests." Is this true for `fail` as well? – noahnu Apr 13 '13 at 21:02
  • It's to do with variable scope. As I understand it, `function a(){var i;a()}` will create more and more `i` variables as it loops, but `function a(){var i;setTimeout('a()',0)}` will only have the last one (using quotes puts the called function into an `eval` environment which doesn't inherit scoped variables). `function a(){var i;setTimeout(a,0)}` *will* create more `i` variables. Whether you need it or not depends on how jQuery handles things internally. – Dave Apr 13 '13 at 21:02
  • @noahnu You're not using JSONP or script requests, so don't worry about that. – Dave Apr 13 '13 at 21:03
  • @noahnu calling something with `setTimeout(..., 0)` basically schedules it to be called as soon as JavaScript is idle. This allows the current function to return. If you call a method directly, the current function does not return until the called function returns, which makes the call stack deeper. – tcovo Apr 13 '13 at 21:03
  • In this case I don't think the `setTimeout(..., 0)` is necessary because the `error` handler will get called with a "fresh" call stack -- the call stack would not include the code that called `$.ajax()`. – tcovo Apr 13 '13 at 21:06
  • Works, thanks. Would you recommend I call the function again regardless of the nature of the error? – noahnu Apr 13 '13 at 21:32
  • Only if you delay it a few seconds; otherwise you could easily DDoS yourself. Also, it's generally considered good practice to increase the wait time as the number of failures increases: the first retry waits (maybe) 1 second, then 2, then 4, 8, etc., with some eventual cap (maybe at 2 minutes) -- a sketch of this backoff pattern follows below. – Dave Apr 13 '13 at 21:50
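
A minimal sketch of the exponential-backoff idea from the last comment, again assuming the same `url`, `data`, and `success` placeholders as the answer's snippet (`fetchWithBackoff` and the specific delays are illustrative):

function fetchWithBackoff(delay) {
    $.ajax({
        dataType: "json",
        url: url,
        data: data,
        success: success, // on success, resume normal polling (resetting the delay)
        timeout: 10000,
        error: function () {
            // Double the wait after each consecutive failure, capped at 2 minutes.
            var next = Math.min(delay * 2, 2 * 60 * 1000);
            setTimeout(function () { fetchWithBackoff(next); }, delay);
        }
    });
}

fetchWithBackoff(1000); // first retry after 1 second, then 2s, 4s, ... up to 2 minutes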