
I'm supposed to parse a very large JSON array in JavaScript. It looks like:

mydata = [
    {'a':5, 'b':7, ... },
    {'a':2, 'b':3, ... },
    ...
]

Now the thing is, if I pass this entire object to my parsing function parseJSON(), then of course it works, but it blocks the tab's process for 30-40 seconds (in the case of an array with 160,000 objects).

During this entire process of requesting the JSON from a server and parsing it, I'm displaying a 'loading' gif to the user. Of course, after I call the parse function, the gif freezes too, leading to a bad user experience. I guess there's no way to get around this time, but is there a way to somehow (at least) keep the loading gif from freezing?

Something like calling parseJSON() on chunks of my JSON every few milliseconds? I'm unable to implement that though, being a noob in JavaScript.
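To be concrete, here is a minimal sketch of the kind of chunking I mean (`processItem` is a placeholder for whatever per-object work my real parsing function does):

```javascript
// Minimal sketch: process `data` in batches so the event loop
// (and the loading gif) gets a chance to run between them.
// `processItem` is a placeholder for the real per-object work.
function processInChunks(data, processItem, chunkSize, done) {
    var i = 0;
    function next() {
        var end = Math.min(i + chunkSize, data.length);
        for (; i < end; i++) {
            processItem(data[i]);
        }
        if (i < data.length) {
            setTimeout(next, 0);   // yield to the browser, then continue
        } else if (done) {
            done();                // all items processed
        }
    }
    next();
}
```

Is something along these lines the right idea, and roughly what chunk size would be sensible?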

Thanks a lot, I'd really appreciate if you could help me out here.

user1265125
    160 000 objects is really a lot. You should split it up on server side – Pinoniq Sep 12 '14 at 14:12
  • Have you checked this thread? http://stackoverflow.com/questions/1160137/execute-background-task-in-javascript – Tasos K. Sep 12 '14 at 14:15
  • Apply pagination on server side, add pagination numbers and next page etc in json, and then keep calling api on ajax success. – Rohit Awasthi Sep 12 '14 at 14:16
  • You could split the data and send the chunks at different points of time if that's applicable to your case. – KiaMorot Sep 12 '14 at 14:16
  • You can call your parseJSON using setTimeout with a minimum delay. Refer [this](http://stackoverflow.com/questions/9516900/how-can-i-create-an-asynchronous-function-on-javascript) SO post for some more info. – Thangadurai Sep 12 '14 at 14:17
  • Wouldn't Web Workers be useful for this case? (I've never used them myself, but they are meant to run intensive processes in the background of the main thread). More info at MDN https://developer.mozilla.org/en/docs/Web/Guide/Performance/Using_web_workers – Sebastien Daniel Sep 12 '14 at 14:21
  • Is there a reason you can't use the native `JSON.parse` function? – The Spooniest Sep 12 '14 at 14:23
  • You shouldn't try to fix this in JS: fix the code that is sending you this insanely large JSON string. This is a server-side problem, not a JS issue – Elias Van Ootegem Sep 12 '14 at 14:25

2 Answers


You might want to check this link. It's about multithreading.

Basically :

var url = 'http://bigcontentprovider.com/hugejsonfile';

// Build the worker's source as a string. Inside the worker,
// importScripts() fetches the JSONP-style response, which calls
// send() with the data; send() posts it back and closes the worker.
var f = '(function() {' +
        '    send = function(e) {' +
        '        postMessage(e);' +
        '        self.close();' +
        '    };' +
        '    importScripts("' + url + '?format=json&callback=send");' +
        '})();';

var _blob = new Blob([f], { type: 'text/javascript' });

var _worker = new Worker(window.URL.createObjectURL(_blob));
_worker.onmessage = function(e) {
    // Do what you want with your JSON (available as e.data)
};
// No postMessage() call is needed: the worker's IIFE runs on startup.

Haven't tried it myself to be honest...

EDIT about portability: Sebastien D. posted a comment with a link to MDN; I've added a reference to its browser-compatibility section.

Cyril Duchon-Doris
  • This is a really good suggestion, but perhaps you could expand on it a bit and add some warnings about portability? – nepeo Sep 12 '14 at 14:41
  • I don't know enough on the subject to be able to expand too much. However, I reposted a link with browser compatibility. – Cyril Duchon-Doris Sep 12 '14 at 14:49
  • Me either to be honest, I hadn't even heard of the feature until you posted it! It's quite an exciting api though. Given that I mostly work on mobile devices the lack of android support is depressing, guess it's going to be awhile before I get to try it out... – nepeo Sep 12 '14 at 14:57

I have never encountered a complete page lockdown of 30-40 seconds; I'm almost impressed! Restructuring your data to be much smaller, or splitting it into many files on the server side, is the real answer. Do you actually need every little byte of the data?

Alternatively, if you can't change the file, @Cyrill_DD's answer of a worker thread will be able to parse the data for you and send it to your primary JS. This is not a perfect fix, as you would guess. Passing data between the two threads requires the information to be serialised and reinterpreted, so you could find a significant slowdown when the data is passed between the threads, and be back to square one again if you try to pass all the data across at once. Building a query system into your worker thread for requesting chunks of the data when you need them, and using the message callback, will prevent slowdown from parsing on the main thread and allow you complete access to the data without loading it all into your main context.
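A rough sketch of what that chunk-query pattern could look like (names like `sliceChunk` and the `{ start, count }` message shape are illustrative, and the worker is assumed to have already fetched and parsed the big array into `data`):

```javascript
// Sketch of a chunk-query protocol between the page and a worker.
// The worker holds the full parsed array; the page asks for slices
// on demand, so only small chunks ever cross the thread boundary.

// ---- worker side (worker.js) ----
var data = [];  // in a real worker: filled via importScripts/XHR + JSON.parse

// Return just the requested window of the data.
function sliceChunk(arr, start, count) {
    return arr.slice(start, start + count);
}

// Only wire up the message handler when actually running in a worker.
if (typeof self !== 'undefined' && typeof importScripts === 'function') {
    self.onmessage = function (e) {
        postMessage(sliceChunk(data, e.data.start, e.data.count));
    };
}

// ---- page side ----
// var worker = new Worker('worker.js');
// worker.onmessage = function (e) { /* render e.data, request more */ };
// worker.postMessage({ start: 0, count: 1000 });  // first 1000 objects
```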

I should add that worker threads are relatively new; mainstream browser support is good, but mobile support is terrible... just a heads up!

nepeo