
I have a single JSON array containing server logging data, which I am plotting with flot. I have successfully done this with 40,000+ JSON objects.

Then I observed an array with 581,901 elements; when I try to parse it, the browser crashes. The JSON data is approximately 55 MB, and the browser is Firefox. Is there an efficient way to do this?

A sample object (one of the 581,901 lines):

{"date":"30-May-2016:00:00:00","url":"retriveImage","status":"200","data":"7480"}

I am doing this:

var ct200 = 0, ct503 = 0, e503 = 0, eoth = 0;
var two = [], five = [];
var target = moment("30-May-2016", "D-MMM-YYYY");

for (var i = 0; i < da.length; i++) {
    // Compare only the date part against the target day.
    var ts = moment(da[i].date, "D-MMM-YYYY");
    if (ts.isSame(target)) {
        // The date field looks like "30-May-2016:00:00:00", so the
        // format string needs the colon before the time part.
        var hour = parseInt(moment(da[i].date, "D-MMM-YYYY:H:mm:ss").format("H"), 10);
        var code = parseInt(da[i].status, 10);
        if (code < 500) {
            ct200++;
        } else {
            if (code === 503) {
                e503++;
            } else {
                eoth++;
            }
            ct503++;
        }
        // Push inside the if-block, so entries from other days don't
        // repeat the previous hour/count pair in the series.
        two.push([hour, ct200]);
        five.push([hour, ct503]);
    }
}

1 Answer


When parsing a JSON file (or an XML file, for that matter) you have two options. You can read the file entirely into an in-memory data structure (a tree model), which allows easy random access to all the data, or you can process the file in a streaming manner. With streaming, either the parser is in control and pushes out events (as XML SAX parsers do), or the application pulls events from the parser. The push approach makes it easy to chain multiple processors but is quite hard to implement; the pull approach is rather easy to program and lets you stop parsing as soon as you have what you need.

Have a look at this:

https://www.ngdata.com/parsing-a-large-json-file-efficiently-and-easily/
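In the browser, one push-style streaming parser is oboe.js, which fires a callback per array element as the bytes arrive, so the full 55 MB tree never has to be built at once. Below is a minimal sketch, assuming the array is served at a URL such as /logs.json (a placeholder) and the oboe.js script is loaded on the page:

var ct200 = 0, ct503 = 0;

oboe('/logs.json')
    .node('!.*', function (entry) {
        // Called once per top-level array element as it streams in.
        var code = parseInt(entry.status, 10);
        if (code < 500) {
            ct200++;
        } else {
            ct503++;
        }
        // Returning oboe.drop discards the node after use, so the
        // whole 581,901-element array is never held in memory.
        return oboe.drop;
    })
    .done(function () {
        // All elements processed; build the flot series here.
        console.log('2xx-4xx:', ct200, '5xx:', ct503);
    })
    .fail(function (err) {
        console.error('transfer or parse failed', err);
    });

Since the per-element callbacks run as chunks arrive over the network, the counting work is also spread out over time instead of blocking the UI in one long parse.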
