I have a report system where the user clicks a report, the SQL is executed, and the results are shown in a table. The problem I have is that a single report brings back over 500,000 rows. The system takes a while to retrieve the data via an AJAX call; however, as soon as that has finished, the browser "hangs" whilst it is adding the HTML to the page.

My question is this: is there another way I can add the HTML to the page without causing the browser to hang?

var HTMLPRE = '<p class="text-center"><b id="lblReportStatus">Producing report...</b><br><small>Average time: ' + parseFloat($('#hidReportID').attr('data-avg')).toFixed(2) + ' seconds</small>' +
    '<br><small>Current time: <span id="divCurrentTimeTaken">0</span></small></p>';
window.CurrentTimer = 0;
$('#reportresults').html(HTMLPRE);
// start the timer
window.ProducingReportTimer = window.setInterval(function() {
    window.CurrentTimer++;
    $('#divCurrentTimeTaken').html(window.CurrentTimer + ' seconds');
}, 1000);

// NOTE: FormData here is assumed to be a plain object of form values defined
// elsewhere (the name shadows the built-in FormData constructor)
$.post(window.routes['ReportWriter.ProduceReport'], FormData, function(resp, status) {
    if (status === 'success') {
        if (resp.code == '200') {
            // use .html() rather than .text() so the <br> renders as markup
            $('#lblReportStatus').html('Generating report...<br>Please note your browser may become unresponsive. Please wait for a few minutes if this happens.');
            ProduceReportTable(resp);
        }
    } else {
        alert("Unable to produce report.  Please contact support with the below information:\r\nStatus code" + status);
    }
}).fail(function(jqXHR, textStatus) {
    // alert the textual status; alerting the jqXHR object would show "[object Object]"
    alert("Unable to produce report. Please contact support with the below information:\r\n" + textStatus);
});
function ProduceReportTable(resp){
    var ReportHTML = '<div class="row"><div class="col-xs-12"><button class="btn btn-primary" id="btnExportExcel"><i class="fa fa-file-excel"> Excel</i></a></div>' +
        '</div><div class="col-xs-12"><div class="table-responsive" style="overflow-x: auto;">' +
        '<table class="table-hover table-striped table" id="tblReport">' +
        '<thead><tr>';
    // loop through the headers first
    $(resp.headers).each(function (idx, head) {
        ReportHTML += '<th>' + head + '</th>';
    });

    ReportHTML += '</tr></thead><tbody>';
    // loop through the data

    $(resp.rows).each(function (idx, row) {
        ReportHTML += '<tr>';
        // use the value argument rather than `this`; jQuery boxes `this`, so a
        // null cell would otherwise arrive as the global window object
        $.each(row, function (i, cell) {
            ReportHTML += '<td>' + (cell == null ? '' : cell) + '</td>';
        });
        ReportHTML += '</tr>';
    });
    // close the table, .table-responsive, column and row wrappers
    ReportHTML += '</tbody></table></div></div></div>';
    $('#reportresults').html(ReportHTML);
    window.clearInterval(window.ProducingReportTimer);
    /*
    $('#tblReport').dataTable({
        deferRender:    true,
        // scrollY:     200,
        scrollCollapse: true,
        scroller:       true
    });
    */

    // enable the excel button
    $('#btnExportExcel').on('click', function(e){
        e.preventDefault();
        let TableSource = document.getElementById('tblReport');
        var today = new Date();
        var dd = today.getDate();
        var mm = today.getMonth() + 1; // January is 0!
        var yyyy = today.getFullYear();
        if (dd < 10) {
            dd = '0' + dd;
        }
        if (mm < 10) {
            mm = '0' + mm;
        }
        // use a consistent separator; '/' is not valid in a filename
        today = yyyy + '-' + mm + '-' + dd;
        GenerateXLSX(TableSource, true, resp.ReportName + '-' + today + ".xlsx");
    });
}

Sorry if this has been answered elsewhere; I've had a search and have been unable to find anything.

Neo
  • Could you introduce pagination into this solution? 500,000 rows on a single page is a few too many. – Peter Meadley Jun 11 '19 at 14:29
  • Or even infinite scrolling. Loading many more rows than are ever visible at once is usually a bad idea. – baao Jun 11 '19 at 14:34
  • You don't even have to think about browser capabilities before realising that this is a bad idea. 500,000 rows of anything in a single page is an unusable mess. A 1px movement of the scrollbar will move you ~500 entries. Useless. For the love of all that is good, work on a better UX by paging your results to something more sensible/useful to the user. There's no way that all of that information is relevant to what you are trying to communicate. – spender Jun 11 '19 at 14:35
  • Check out this article on DOM recycling for infinite scroll: https://developers.google.com/web/updates/2016/07/infinite-scroller – Sean Jun 11 '19 at 14:35
  • @PeterMeadley Pagination would be difficult as I have no control over the result set. Unless you can think of a way of implementing pagination with the large dataset? – Neo Jun 11 '19 at 14:39
  • @Neo I'm assuming then that you are using an API call built by someone else. If that API call has parameters such as offset and limit, problem solved. Otherwise, if the records it retrieves don't change very often, I would suggest caching them somewhere you can control and accessing them with your own call, limiting the amount of results. – Peter Meadley Jun 11 '19 at 14:42
  • In short, you don't need this. – Robert Harvey Jun 11 '19 at 14:43
  • @PeterMeadley Sadly no :( I had considered whether DataTables had a solution, or whether a web worker would do the job (maybe if I batch-update 1,000 at a time? see the sketch after these comments) – Neo Jun 11 '19 at 14:45
  • Virtual scrolling is definitely something you should consider – Ayush Gupta Jun 11 '19 at 14:46
  • but it would be best if you don't need this – Ayush Gupta Jun 11 '19 at 14:47
  • @AyushGupta do you have any links I can look at? – Neo Jun 11 '19 at 14:48
  • For the users' sake, I would have to recommend replicating the data table somewhere else where you DO have control, limiting the results, and using an offset. The implementation of this will depend on your specific circumstances. – Peter Meadley Jun 11 '19 at 14:49
  • @Neo are you working with pure JS or a framework? – Ayush Gupta Jun 11 '19 at 14:52
  • I think any program would hang if trying to show half a million rows of data - unless you have a supercomputer with a lot of memory – Pete Jun 11 '19 at 14:56
  • Possible duplicate: https://stackoverflow.com/questions/2402953/javascript-data-grid-for-millions-of-rows – Naga Sai A Jun 11 '19 at 15:53
  • Just to follow on a bit, from a UX standpoint, anything more than around 100, maybe 200, records in a table becomes not especially usable, as a general rule of thumb, and usually indicates a design that doesn't really address user needs properly (think things like search criteria or pre-filter criteria, or even splitting the screen up into some structure like, maybe, tabs, to logically separate out types of records). Even if this is truly a reporting scenario, it's still worth questioning the requirements a bit before charging ahead. – Frank W. Zammetti Jun 11 '19 at 22:09
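
A minimal sketch of the batch-update idea floated in the comments (the helper name is hypothetical; it assumes resp.rows is the array already returned by the AJAX call). Rows are appended in chunks of 1,000 and control is yielded back to the browser between chunks, so the page stays responsive even though the total work is the same:

// Hypothetical sketch: render rows in chunks so the browser can paint
// between batches instead of parsing one enormous HTML string at once.
function renderRowsInChunks(rows, $tbody, chunkSize) {
    chunkSize = chunkSize || 1000;
    var index = 0;
    function renderChunk() {
        var html = '';
        var end = Math.min(index + chunkSize, rows.length);
        for (; index < end; index++) {
            html += '<tr>';
            $.each(rows[index], function (i, cell) {
                html += '<td>' + (cell == null ? '' : cell) + '</td>';
            });
            html += '</tr>';
        }
        $tbody.append(html);
        if (index < rows.length) {
            // yield to the event loop before rendering the next batch
            window.setTimeout(renderChunk, 0);
        }
    }
    renderChunk();
}
// usage: renderRowsInChunks(resp.rows, $('#tblReport tbody'));

This keeps the tab usable, but it does not fix the underlying UX problem the comments describe: the user still ends up with half a million rows in one table.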

1 Answer


No. The comments have already given great answers; the gist is that returning that many rows is a bad idea from a performance standpoint (nothing can render that much data properly), from a usability standpoint (no one can possibly work with that much data at once), and from a security standpoint (you're risking a huge DoS attack on your app). The only use case I can imagine for that is generating a report, and in that case you should export an Excel/PDF document instead of showing it in HTML.

You should implement pagination and filtering options to solve your problem. If you have to do it in JavaScript, use jQuery DataTables backed by a proper AJAX endpoint in your app.
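
A minimal sketch of that approach, assuming a hypothetical /reports/data endpoint on your app that speaks DataTables' server-side protocol (it receives the draw/start/length/search parameters DataTables sends and returns one page of rows):

// jQuery DataTables with server-side processing: the browser only ever
// holds one page of rows; paging, sorting and filtering happen on the server.
$('#tblReport').DataTable({
    serverSide: true,   // delegate paging/filtering to the endpoint
    processing: true,   // show a "Processing..." indicator between requests
    pageLength: 100,
    ajax: {
        url: '/reports/data',  // hypothetical endpoint - adjust to your routes
        type: 'POST'
    }
});

With this in place the 500,000-row result set never leaves the database in one piece, and the browser never has to build more than one page of table HTML.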

Léster