
I am working on an application that fetches many records from a database and renders them to a table. The data is retrieved with an AJAX call and appended to the end of the results that are already there.

The number of records varies and could be 10 or 20,000, depending on what the search criteria return.

Now, first things first: I understand that having 20,000 records on the page isn't smart, but this was a requirement so that all the records can be seen, modified, or marked up from this page.

The AJAX call to the PHP page is fine and seems to work pretty quickly. The bottleneck, however, is appending that data to the DOM once it's received via AJAX.

In the code below, you will see the main function that creates the table rows and returns them to the AJAX call.

/*
  Create Table Rows
*/
function createTableRows($dashboardID){

  // Timer
  $start = microtime(true);

  // Dashboard
  $dashboardID = $_REQUEST['dashboardID'];
  $objDB = new DB;
  $objData = $objDB
    -> setStoredProc('RenderDashboard')
    -> setParam('dashboardID', $dashboardID)
    -> setParam('limit', $_REQUEST['limit'])
    -> setParam('offset', $_REQUEST['offset'])
    -> setParam('actor', '1234')
    -> execStoredProc()
    -> parseXML();

  // Fetch other data
  $markupData = fetchMarkup($dashboardID);
  $exportFields = fetchExportFields($dashboardID);
  $ignore = array('identifierQID', 'identifierNTID');

  // Vars
  $outputArray = array();
  $recordCount = 0;

  // Loop over our data
  foreach($objData->data as $r){

    $outputArray[] = '<tr data-qid="'.$r->identifierQID.'" class="primaryValue ' . searchMarkup($markupData, $r->identifierQID) . '">';

    // Loop over our fields
    foreach($r as $key => $value){

      // Vars
      $fieldID = str_replace('_', '', $key);

      // Don't include our identifier columns
      if(!in_array($fieldID, $ignore)){
        $outputArray[] = '<td data-tableexport-display="always" class="small' . ($exportFields ? (in_array($fieldID, $exportFields) ? ' hidden' : '') : '') . '">' . formatFieldData($fieldID, $value) . '</td>';
      }

    }

    // Notes always come last
    $outputArray[] = '<td data-tableexport-display="always" class="notesTD allowContext hidden"></td>';

    $outputArray[] = '</tr>';
    $recordCount++;

  }

  // Join our rows array and return it
  $end = microtime(true);
  $timer = number_format($end - $start, 2);
  return array(join('', $outputArray), $recordCount, $timer);
}

// This is what gets passed back to our AJAX call on the UI
echo createTableRows($dashboardID)[0];

Here is the JavaScript that processes the response.

// Given data, create our table rows
function createRows(data) {

    // Update Progress Bar
    $('[name=progressDiv]').show();

    // Append the results to the DOM
    /* THIS IS WHAT IS KILLING THE SPEED!!!!*/
    $('[name=resultsTable]').append(data);

    // If our total number of records exceeds the threshold, we will be using the progress bar for the status
    if (totalRecords > maxThreshold) {
        $('[name=resultsProgress]').attr('aria-valuenow', currentPage / totalPages * 100)
            .css('width', currentPage / totalPages * 100 + '%')
            .text((currentPage < totalPages ? recordsPerPage * currentPage + ' of ' + totalRecords + ' records loaded' : 'Loaded ' + totalRecords + ' records!'));
    } else {
        // Loaded all records in one shot, update progress bar
        $('[name=resultsProgress]').attr('aria-valuenow', 100)
            .css('width', '100%')
            .text('Loaded ' + totalRecords + ' records!')
            .removeClass('active');
    }
    // Do we have more data to load?
    if (currentPage < totalPages && totalRecords > maxThreshold) {
      // Allow a little time for the progress to update before locking up
      setTimeout(function(){
        fetchMore();
      }, 100);

    }

    // After the table has been appended to the DOM, run clean up to enable any additional functionality
    cleanUp();
}

The Issue:

The problem is that the append is locking up the browser and causing it to be unresponsive until the append has completed. I already have this broken up so it will fetch the data in batches, but that isn't the issue; it's handling the rows in the response.

The Question:

Is there a way to process the results in batches and append them without locking up the browser? The response itself is just a bunch of TRs that are appended to the TBODY of my table.

My last resort is having to page the results. If I can fix this bottleneck, I can convince them to do the paging for larger data sets.

I guess I am looking for a way to either return the results in a better format for appending, or to break up the response and have it append in batches while another AJAX call fetches more data to process, along the lines of the sketch below.
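
For example, a rough sketch of what I mean (the function name and chunk size are just placeholders):

function appendInChunks(html, chunkSize) {

    // Parse the response once, off the live DOM
    var rows = $('<tbody>' + html + '</tbody>').children().toArray();

    function nextChunk() {
        // Append a slice of rows, then yield so the browser can repaint
        $('[name=resultsTable]').append(rows.splice(0, chunkSize));
        if (rows.length) {
            setTimeout(nextChunk, 0);
        }
    }
    nextChunk();
}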

Thoughts?

SBB
  • appending a large number of rows to a table will always be killer. – epascarello Dec 27 '16 at 15:12
  • window.requestAnimationFrame(fetchMore); – Jonas Wilms Dec 27 '16 at 15:13
  • Can you wait to load more as a user scrolls? – Asking Questions Dec 27 '16 at 15:14
  • What about using Infinite scroll? – Vitalii Strimbanu Dec 27 '16 at 15:15
  • Infinite scroll is an option, but what if a user is looking for a specific result in the response to modify? They will have to scroll until they find it, because searching the results that are already appended wouldn't find anything that hasn't been loaded to the DOM. I also use my download function to create an Excel document of the table/rows on the page, so I would have to somehow trigger it to push out all the results anyway if they click download so that all records could be included. – SBB Dec 27 '16 at 15:18
  • It's been a while since I worked on an AJAX-PHP combination but I'm sure you might be misusing AJAX.... **Asynchronous** JavaScript and XML. You should be considering that and do all you can to actually use objects like `XMLHttpRequest` the right way. When you don't make things asynchronous, they are synchronous and thus you'll have to wait for the 20,000 lines to load first.... Read this -> http://stackoverflow.com/questions/8567114/how-to-make-an-ajax-call-without-jquery – Jason Krs Dec 27 '16 at 15:18
  • Another thing to read: https://www.sitepoint.com/javascript-large-data-processing/ – Jonas Wilms Dec 27 '16 at 15:20
  • @JasonKrs - async defaults to true on $.ajax calls, even though it's not passed, unless I am mistaken about what you are referring to? – SBB Dec 27 '16 at 15:24
  • @Jonasw - I saw that article during some research and I wasn't quite sure how to apply it to my situation since I am receiving an HTML block of data back instead of an Array that could be broken up using that code. – SBB Dec 27 '16 at 15:28
  • @VitaliiStrimbanu - Here is my scenario. This page loads employees from our database as the results. If the search criteria is "Show me everyone that reports to this manager" we could have a large result set. In that search, our user may be looking specifically for "Bob" but he is result number 200 and we are showing 20 results per "scroll". In turn, this user would have to scroll and load data 10 times in order to just see the record they are looking for. I currently have a search box on the page that looks through the records on the DOM but that wouldn't work for data that's not appended. – SBB Dec 27 '16 at 15:32

2 Answers


You can use a JavaScript table component to handle displaying large sets of JSON data. For example: https://clusterize.js.org/
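
A minimal sketch of the idea, assuming the AJAX response is changed to return the rows as an array of '<tr>...</tr>' strings and the table is wrapped in the scroll/content elements Clusterize expects (the IDs below are illustrative):

var clusterize = new Clusterize({
    rows: rowStrings,         // array of '<tr>...</tr>' strings from the AJAX response
    scrollId: 'scrollArea',   // scrollable wrapper div around the table
    contentId: 'contentArea'  // the tbody that receives the visible rows
});

// Later batches can be appended without rebuilding what is already rendered
clusterize.append(moreRowStrings);

Clusterize keeps only the visible slice of rows in the DOM, so the page stays responsive regardless of the total row count.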

Nadir Latif

You may push it to the browser's queue:

function createRows(data){
    // Parse the JSON array of row strings
    var rows = JSON.parse(data);
    rows.forEach(function(row){
        // Queue each row append so the browser can repaint in between
        setTimeout(function(){
            $('[name=resultsTable]').append('<tr>' + row + '</tr>');
        }, 0);
    });
}

This requires data to be a JSON array; you may just return an array of strings that can be handled:

["<td>User 1</td>","<td>User 1</td>","<td>User 1</td>"]

Solution two: completely generate it on the server side:

<progress value="0" max="100" id="prog"></progress>
<iframe id="data" src="yourdata">
</iframe>

$("#data").on("load",function(){
  $("#prog").val(100);
 });
Jonas Wilms
  • That is what is happening. The first call does X amount and then checks to see if there is more to load. There is a `fetchMore` function in the PHP code that handles this. All it ends up doing is passing that data back to the JavaScript to append it again, which is the same boat I am in now. – SBB Dec 27 '16 at 15:25
  • The iframe is an interesting concept but would still require me to get the data on the page in batches. Loading a huge amount of data would just cause the iframe to be a white page until the PHP has loaded every record, instead of doing it in batches. The JavaScript in that window would then freeze the main results the user needs to interact with, just like it does currently? – SBB Dec 27 '16 at 15:45
  • @SBB yep. i think the first solution is what you want – Jonas Wilms Dec 27 '16 at 15:48
  • I will give this a shot. Only issue is I will have to pass the `TR`'s with the data since there is custom css and markup that is added to it when generating the data. Not sure if that will be an issue – SBB Dec 27 '16 at 15:51
  • it shouldn't; you may add it inline. Also, the iframe should work if you use ob_flush() in PHP. – Jonas Wilms Dec 27 '16 at 15:53