
I have a SQL table that consists of 16742 records. Using jQuery, I do a GET and retrieve all the data. Then, for each row, I check it against every other row, and if my condition matches I save that row to another table using a POST. This all works fine; we worked out that it takes 6 seconds to iterate through the table once, and based on that number we calculated that the script would run for about 27 hours.

Now, when the application runs, it always stops after 8190 rows have been saved to the other table.

It seems like the server times out. Is there something that can just keep the script running?

Thank you for your input. Here is the code I have up to now.

<script type="text/javascript">
$.getJSON("/Home/GetAllAusPostCodes", function (data) {

    var firstlat;
    var firstlng;
    var origionalpostcode;
    var auspostid;
    var orsuburb;

    $.each(data, function (index, item) {

        firstlat = item.latitude;
        firstlng = item.longitude;
        auspostid = item.ID;
        origionalpostcode = item.Postcode;
        orsuburb = item.Suburb;

        // Compare this row against every other row in the table
        $.each(data, function (innerIndex, innerItem) {

            var p1 = new LatLon(Geo.parseDMS(firstlat), Geo.parseDMS(firstlng));
            var p2 = new LatLon(Geo.parseDMS(innerItem.latitude), Geo.parseDMS(innerItem.longitude));
            var distance = p1.distanceTo(p2);

            // Save any row that is within 30 km but not the same point
            if (distance > 0 && distance < 30) {

                var url = "/Home/SaveDistancePostCode?AusPostCodeID=" + auspostid +
                          "&PostCode=" + innerItem.Postcode +
                          "&OrigionalSuburb=" + orsuburb +
                          "&SuburbName=" + innerItem.Suburb +
                          "&lat=" + innerItem.latitude +
                          "&lng=" + innerItem.longitude +
                          "&state=" + innerItem.State +
                          "&distance=" + distance;

                $.post(url);
            }
        });
    });

    alert('complete');
});
</script>

This current script still stops at 8190 records, even on a much better PC than in my first tests. I will continue trying to get it working. Thanks for your input.

Costas Aletrari
  • if it is indeed the *server* that times out, then it's not a jQuery problem but a server problem, and you should tag your question with the language you're using. – Joeytje50 Jun 15 '14 at 00:51
  • there is no timeout for the entire application, only for a single HTTP request. You probably just need some `try-catch` blocks. – Fabricator Jun 15 '14 at 00:57

1 Answer


There are four possible issues here and we can't yet tell which one it is for sure, but I'll address possible changes for all four:

  1. If the server is timing out because the client isn't processing the data fast enough, and you can't find a faster way to process the data on the client, then you can only fix this with a server change: speed up its request handling, lengthen its timeout, or make a structural change in how you request the data (e.g. request smaller chunks of data that the server can deliver without hitting the timeout).

  2. If the ajax call itself is timing out, because the server takes too long to respond or because it takes too long to read all the data, then you can lengthen the timeout on the jQuery ajax call and see if that helps. jQuery has an ajax option called `timeout` that lets you set the client-side request timeout, either individually for a particular ajax call or globally for all jQuery ajax calls (see the first sketch after this list). See the jQuery ajax reference for details.

  3. If the browser is timing out because a single JavaScript operation is taking too long to process the server results, then you will have to restructure your client-side code to process the data in chunks and let the JS engine service other events in between (see the chunking sketch after this list). This allows the client operation to run essentially indefinitely without making the browser unhappy. Here are some references on how to do chunking: Processing very large array, jQuery ajax freezes UI on large response, Avoiding unresponsive script message.

  4. If you're doing a lot of consecutive POSTs to your server and that is what leads to the issue, then it's a bit hard for us to know exactly where the problem lies (it could be some combination of the above three items), but the fastest way to improve performance of many consecutive POST operations is to rearrange how you transfer the data so that you send larger chunks of data rather than lots of smaller pieces (see the batching sketch after this list). The slowest way to transfer a lot of data is to do a request, wait for a response, do another request, wait for another response, and so on. It's much faster to gather up a whole batch of data, send it all at once, wait for one response, then send the next batch.
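For point 2, here's a minimal sketch of lengthening the client-side timeout with jQuery's `timeout` option. The 120000 ms value is just an assumption; tune it to how long your server actually needs:

    // Set a global timeout for all jQuery ajax calls (value in milliseconds)
    $.ajaxSetup({ timeout: 120000 });

    // Or set it per-call by using $.ajax() instead of $.getJSON()
    $.ajax({
        url: "/Home/GetAllAusPostCodes",
        dataType: "json",
        timeout: 120000, // allow up to 2 minutes for this request
        success: function (data) {
            // process the data here
        },
        error: function (jqXHR, textStatus) {
            // textStatus will be "timeout" if the request timed out
            alert("Request failed: " + textStatus);
        }
    });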
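For point 3, here's a minimal sketch of the chunking pattern. It processes one slice of the array per timer tick so the JS engine can service other events in between; the helper name `processInChunks` and the chunk size of 100 are my own assumptions to illustrate the idea:

    function processInChunks(data, chunkSize, processRow, done) {
        var index = 0;

        function doChunk() {
            var end = Math.min(index + chunkSize, data.length);

            // Process one slice of the array synchronously
            for (; index < end; index++) {
                processRow(data[index], index);
            }

            if (index < data.length) {
                // Yield to the browser, then continue with the next slice
                setTimeout(doChunk, 0);
            } else if (done) {
                done();
            }
        }

        doChunk();
    }

    // Usage: replace the outer $.each() with a chunked loop
    processInChunks(data, 100, function (item) {
        // ... the existing per-row comparison logic goes here ...
    }, function () {
        alert('complete');
    });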
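For point 4, here's a sketch of batching the matches on the client and posting them in larger groups. Note that `/Home/SaveDistancePostCodeBatch` and the batch size of 500 are assumptions; your server would need a new action that accepts a JSON array rather than one record per request:

    var batch = [];
    var BATCH_SIZE = 500; // assumption: tune to what your server handles comfortably

    function queueMatch(match) {
        batch.push(match);
        if (batch.length >= BATCH_SIZE) {
            flushBatch();
        }
    }

    function flushBatch() {
        if (batch.length === 0) return;
        var payload = batch;
        batch = [];

        // One POST carrying many records instead of one POST per record
        $.ajax({
            url: "/Home/SaveDistancePostCodeBatch", // hypothetical batch endpoint
            type: "POST",
            contentType: "application/json",
            data: JSON.stringify(payload)
        });
    }

    // Inside the distance check, call queueMatch({...}) instead of $.post(url),
    // and call flushBatch() once at the very end to send any remainder.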

Or, you could have some combination of these four and need to make more than one change.

jfriend00