
I used the following AJAX call to retrieve data from the database and show it on my page.

$.ajax({
    type: "POST", url: "MyPage.aspx/LoadGrid",
    data: "{idyear:'2020'}",
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    success: function (response) {
        $(".gridBody").html(response.d);
    },
    error: function (xhr, status, err) { // jQuery's $.ajax uses "error", not "failure"
        alert(err);
    }
});

Currently this operation returns 1026 records and takes around 12 seconds.

Since this process is time-consuming and there will be more records in the future, I have to find an alternative way to load the data faster.

So, I tried another approach. I decided to get the total number of records first. For example, I now have 1026 records, and if I want to load my data in bundles of 100 records, I need to make 11 AJAX calls simultaneously and combine the results once all of them finish. I thought that by applying this method I could start all the calls together and wouldn't have to wait for one call to end before starting a new one.

var pagesize = 100;
getQty(function () {
    var pagesqty = Math.ceil(qty / pagesize);
    var results = [];
    var control = 0;
    for (let i = 0; i < pagesqty; i++) {
        $.ajax({
            type: "POST", url: "MyPage.aspx/LoadGridMtoN",
            data: "{idyear:'2020' , page:'" + i + "' , pagesize:'" + pagesize + "'}",
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (response) {
                // "let i" keeps the correct index in each callback;
                // storing into an array preserves page order without eval
                results[i] = response.d;
                control += 1;
                if (control == pagesqty) {
                    $(".gridBody").html(results.join(""));
                }
            },
            error: function (xhr, status, err) { // jQuery uses "error", not "failure"
                alert(err);
            }
        });
    }
});
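For reference, the same "start all calls together and combine at the end" idea can be expressed with `Promise.all`, which keeps the results in page order without `eval` or a manual counter. This is only a sketch: `requestPage` is a placeholder standing in for the `$.ajax` call above (which itself returns a promise-like object).

```javascript
// Fire all page requests in parallel and join the fragments in order.
// requestPage(page) is assumed to return a Promise resolving to that
// page's HTML fragment.
function loadAllInParallel(requestPage, pagesqty) {
  var requests = [];
  for (var i = 0; i < pagesqty; i++) {
    requests.push(requestPage(i));
  }
  // Promise.all preserves the order of the input array, so the
  // fragments can simply be concatenated.
  return Promise.all(requests).then(function (fragments) {
    return fragments.join("");
  });
}
```

The result string can then be handed to `$(".gridBody").html(...)` in a single step.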

But now I am getting a timeout error while executing the AJAX calls. Does anyone know a better way?

P.S.: I wanted to try a web worker, but it seems that I can't use jQuery in a web worker.

P.S.: I don't want to use paging. I should load all the data together.

Behnam
  • _"this operation returns 1026 records and takes around 12 seconds"_ - You might instead want to fix the server side. – Andreas Mar 21 '20 at 11:21
  • The server side is OK. It has a lot of calculations and has to join a lot of tables. – Behnam Mar 21 '20 at 11:29
  • 3
    I suggest that you check your database table indexes, 1026 is not big to make query slow, and also you can implement pagination to speed things up and in your ajax query you pass the page you are requesting. – ROOT Mar 21 '20 at 11:32
  • 1
    Please forget database and server side. my question is about splitting ajax calls. forget numbers consider I have 100000 rows not 1000. now whats your suggestion. – Behnam Mar 21 '20 at 11:46
  • @Behnam, is pagination an option? – ROOT Mar 21 '20 at 11:50
  • 1
    Adding a workaround to circumvent a problem on the server is not a good solution, hence... Fix the server first. – Andreas Mar 21 '20 at 11:52
  • _"I dont want to use paging"_ vs. `data: "{idyear:'2020' , page:'" + i + "' , pagesize:'" + pagesize + "'}"` ? – Andreas Mar 21 '20 at 11:53
  • It's just to load data faster. I don't want to show records on different pages. – Behnam Mar 21 '20 at 11:56
  • 1
    My server side code is optimal or not is not my question. Please just help we with splitting ajax calls. – Behnam Mar 21 '20 at 11:57
  • I seriously doubt it would be any faster if you split it into multiple calls. More likely you will just swamp your backend with calls and it will end up taking the same time or more. Which is really proven by the timeout errors you say you are getting... – fredrik Mar 21 '20 at 12:12

3 Answers


Please note that simultaneously calling endpoints from the client side (instead of making a single call) has more overhead and is not a good solution. For example, your server has to handle more connections from the client, your database has to handle more connections from your ORM in the back end, your table and correspondingly your disk and I/O are stressed, etc.

That said, assuming all parts of your system are well designed, incrementally loading the data is a good solution from both a UX and a technical point of view. I mean, load the first 100 records, and while the user is using them, load the second 100 records and add them to the end of the list (or load them when the user scrolls down).
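A minimal sketch of that incremental approach, where `loadPage` and `append` are placeholder names (not part of the original code) and `loadPage` could wrap the existing MyPage.aspx/LoadGridMtoN endpoint:

```javascript
// Sequentially load pages and append each one as it arrives, so the
// user can start reading while the rest is still loading.
// loadPage(page) is assumed to return a Promise resolving to an HTML
// fragment; append(html) adds it to the grid,
// e.g. $(".gridBody").append(html).
function loadIncrementally(loadPage, append, totalPages) {
  function next(page) {
    if (page >= totalPages) return Promise.resolve();
    return loadPage(page).then(function (html) {
      append(html);          // render this chunk immediately
      return next(page + 1); // only then request the following chunk
    });
  }
  return next(0);
}
```

To load on scroll instead, the call to `next` can be deferred until the user nears the bottom of the page.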

However, eventually you have to consider pagination! You can't load 1M records on your web page, and no one checks and uses all the records on the page anyway. So you have to limit the number of records fetched and use server-side pagination, or offer the users another approach: let them submit a request, then process it, write the report to a raw text file or an Excel file, and email them the download link.

Alireza Mahmoudi

The answer you don't want to hear is that you need to fix your server side. 1026 records really should not take 12 seconds, especially for data being retrieved for the client. Consider reducing the number of JOINs in your SQL statement and aggregating the data in server-side logic instead, try running EXPLAIN against your queries, and add indexes where appropriate.


To answer your question about splitting AJAX calls...

It looks like you have implemented pagination, so perhaps create an asynchronous recursive function that obtains 5-10 records at a time, incrementing the page number and recursing after each promise resolves. You can populate the page and the user will see data without waiting so long. However, you must understand that this increases the volume of requests to your server, and it will probably end up taking longer to obtain all of the data. I feel that the way you are trying to accomplish this goal is bad practice.
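A rough sketch of such a recursive loader, assuming a hypothetical `fetchPage(pageNum, pageSize)` that returns a Promise of an HTML fragment and an empty string when no records remain:

```javascript
// Recursively fetch one page after another, rendering each chunk as it
// arrives; the recursion stops when the server returns an empty
// fragment. Resolves with the number of pages loaded.
function loadAllPages(fetchPage, pageSize, render, pageNum) {
  pageNum = pageNum || 0;
  return fetchPage(pageNum, pageSize).then(function (html) {
    if (!html) return pageNum;  // empty response: no more records
    render(html);               // show this chunk right away
    return loadAllPages(fetchPage, pageSize, render, pageNum + 1);
  });
}
```

Because each request starts only after the previous one finishes, this avoids swamping the server with simultaneous calls.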

buttface64

Assuming you can make changes to the server side:

  1. Create a new table with all the fields that you are going to need on the front end.
  2. Write a stored procedure to update this table on a regular basis.
  3. Use this table in your AJAX call to fetch the records.
  4. Use pagination. No one is going to use 1000+ records at a time.
  5. Provide a search option at the top, in case you feel the user must have access to all the records.
  6. As suggested in other answers, don't create multiple AJAX calls. You will only end up regretting it and creating a bottleneck for yourself in later stages.
Irfan Harun