
I'm getting XML data via jQuery, but the XML file I'm using has 5,000+ items I want to read. When my page tries to load the data, it freezes after a few seconds, shows "This page is being slowed down", and then crashes.

Is there a way I can get the data without the page crashing? Maybe slow the process down so it isn't doing so much work at once? Here is the code I'm using to get the XML data:

$(document).ready(function() {
  $.ajax({
    type: "GET",
    url: "mymxl.xml",
    dataType: "xml",
    success: function(xml) {
      $(xml).find("programme").each(function() {
        $("xmlcontent").append($(this).find("title").text());
      });
    },
    error: function() {
      alert("An error occurred while processing XML file.");
    }
  });
});
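
For reference, a minimal sketch of one way to spread the work out (an illustration, not the asker's actual code): collect the titles first without touching the DOM, then append them in small batches with setTimeout so the browser gets a chance to paint between batches. It assumes the target is an element with id="xmlcontent"; the selector and batch size are placeholders to adjust to the real markup.

$(document).ready(function() {
  $.ajax({
    type: "GET",
    url: "mymxl.xml",
    dataType: "xml",
    success: function(xml) {
      // Pull the titles out of the XML first, without touching the DOM.
      var titles = $(xml).find("programme").map(function() {
        return $(this).find("title").text();
      }).get();

      var BATCH_SIZE = 200; // hypothetical value; tune for your data
      var index = 0;

      function appendBatch() {
        // Join one slice into a single string so the DOM is updated once per batch.
        $("#xmlcontent").append(titles.slice(index, index + BATCH_SIZE).join(""));
        index += BATCH_SIZE;
        if (index < titles.length) {
          setTimeout(appendBatch, 0); // yield to the browser between batches
        }
      }
      appendBatch();
    },
    error: function() {
      alert("An error occurred while processing XML file.");
    }
  });
});

Even with batching, as the comments below point out, rendering 5,000+ entries client-side is rarely useful; filtering or server-side paging keeps both the payload and the DOM small.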
  • `I am using has 5,000+ pieces of data I want to get` for what possible reason do you need that much data? I'd suggest using filtering and paging instead. Any attempt to improve performance is just transferring the simple problem due to weight of data. – Rory McCrossan Aug 10 '18 at 09:45
  • its a epg tv guide and has lots of data – lanjes Aug 10 '18 at 09:46
  • All the more reason to use server-side paging then. – Rory McCrossan Aug 10 '18 at 09:47
  • @RoryMcCrossan Yeah, sorry - my bad. I didn't even consider `$(xml)` – Reinstate Monica Cellio Aug 10 '18 at 09:50
  • If you want to do this on frontend you should [read](https://developer.mozilla.org/en-US/docs/Web/API/Streams_API/Using_readable_streams#Reading_the_stream) the [files](https://gist.github.com/jfsiii/034152ecfa908cf66178) in [chunks](https://stackoverflow.com/questions/49828310/read-chunked-binary-response-with-fetch-api). – NonameSL Aug 10 '18 at 10:21
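
For reference, a rough sketch of the chunked-read idea from the last comment, using fetch and the Streams API instead of $.ajax. Note that this only streams the download; the document is still parsed once at the end, so the DOM work still needs to be batched as in the sketch above. The "xmlcontent" id is an assumption here as well.

fetch("mymxl.xml").then(function(response) {
  var reader = response.body.getReader(); // read the response body in chunks
  var decoder = new TextDecoder();
  var xmlText = "";

  function readChunk() {
    return reader.read().then(function(result) {
      if (result.done) {
        // All chunks received: parse once, then hand off to the batched DOM code.
        var doc = new DOMParser().parseFromString(xmlText, "application/xml");
        console.log($(doc).find("programme").length + " programmes loaded");
        return;
      }
      xmlText += decoder.decode(result.value, { stream: true });
      return readChunk();
    });
  }
  return readChunk();
}).catch(function() {
  alert("An error occurred while processing XML file.");
});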

0 Answers