
I use ajax $.get to read a file from my local server. However, the browser crashed because the file was too large (> 1GB). How can I solve this problem? Are there other solutions or alternatives?

$.get("./data/TRACKING_LOG/GENERAL_REPORT/" + file, function(data) {
    console.log(data);
});
Tsung-Hsiang Wu

2 Answers


A solution, assuming you don't have control over the report generator, is to download the file in multiple smaller pieces using Range headers: process each piece, extract what you need from it (I assume you'll be building some HTML components based on the report), and then move on to the next piece.

You can tweak the piece size until you find a reasonable value: one that doesn't make the browser crash, but also doesn't result in a large number of HTTP requests.
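As a rough sketch of this approach (the function names, chunk size, and `fetch`-based driver below are my own, not from the question), you could compute the byte ranges up front and then request each piece with a `Range` header:

```javascript
// Build the list of Range header values for a file of a given size.
function byteRanges(totalSize, chunkSize) {
    const ranges = [];
    for (let start = 0; start < totalSize; start += chunkSize) {
        const end = Math.min(start + chunkSize, totalSize) - 1;
        ranges.push("bytes=" + start + "-" + end);
    }
    return ranges;
}

// Hypothetical driver: fetch one piece at a time so only a single
// chunk is ever held in memory. This requires the server to honor
// Range requests (it should reply with 206 Partial Content).
async function processInPieces(url, totalSize, chunkSize, handlePiece) {
    for (const range of byteRanges(totalSize, chunkSize)) {
        const response = await fetch(url, { headers: { Range: range } });
        const piece = await response.text();
        handlePiece(piece); // e.g. build HTML from this piece, then discard it
    }
}
```

You would typically discover `totalSize` first with a HEAD request and read the `Content-Length` response header; note that records may be split across chunk boundaries, so the piece handler has to carry any partial trailing line over to the next piece.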

If you can control the report generator, you can configure it to generate multiple smaller reports instead of a huge one.

Cristik

Split the file into many smaller files, or give a select set of users FTP access. I doubt you'd want many people downloading a gigabyte each off your web server.

Seth T