
I am building an app that should be able to work offline. I am using jQuery Mobile, PHP, MySQL and jStorage so that I can easily use HTML5 localStorage.

I am trying to figure out the best way to download the data into the device's localStorage and use it at later stages without slowing down or crashing the browser.

I have around 5000 records (500 KB of data) in a MySQL table, and I need the app to download all of this data so that it can be used at a second stage while offline.

download_script.php returns all the records in JSON format e.g.

{"1":{"1":{"p_bar":"10.30","v_0":"0.0312207306000000","h_vap":"311.78","p_10c":"99.99"}},"2":{"1":{"p_bar":"10.40","v_0":"0.0309405941000000","h_vap":"311.29","p_10c":"0.00"}},

I was wondering if there is any way I could optimize the following script (e.g. in order to not hang the browser, and possibly to display the percentage of data that is being downloaded)

$.ajax({
  url: "download_script.php",
  cache: false,
  success: function(big_json_dump){
    $.jStorage.set('some_key', big_json_dump);
  }
});

Can this approach be optimized, e.g. using RapidJSON? How can I change it so that it displays a live percentage of the data downloaded?

Martin
  • First, set the dataType to "text" so that jQuery doesn't try to parse it, then simply assign it to a localStorage key. You can't really optimize it much past that, other than making the data dump smaller. – Kevin B Jan 27 '14 at 22:01
  • You might be up the creek without a paddle in terms of using AJAX, but here is an idea: http://stackoverflow.com/a/3360510/2191572 – MonkeyZeus Jan 27 '14 at 22:01
  • Maybe this too: http://stackoverflow.com/a/15405450/2191572 – MonkeyZeus Jan 27 '14 at 22:02
  • @MonkeyZeus OK, I will use that XHR method that you have suggested; that will help for some browsers (a sketch of that approach is shown after these comments). But are you suggesting there might be better methods than using AJAX? – Martin Jan 28 '14 at 08:39
  • Depends on what your definition of "better" is. An AJAX call is merely an HTTP request to a web server, so there is literally zero difference between visiting the URL yourself and sending an AJAX request to it. However, an HTTP request has "network overhead" with every request, so depending on how far away your server is from the user, or how fast or slow their internet connection is, this HTTP request could take just as long to initiate as the initial visit to your site; but thanks to things like DNS caching and such, this is somewhat minimized unless you are on 3G or something. – MonkeyZeus Jan 28 '14 at 13:39
  • If you were to do some sort of chunked download process, then using WebSockets for 10-20 rapid requests is much better, because WebSockets don't need to initiate the connection multiple times; they keep a persistent connection to your server, just like a pconnect for a DBMS. In case you were wondering how this site is so fast with notifications and such, it is because it uses WebSockets as well. This site is built and maintained by some of the best and brightest who look out for the website's best interest, and they have chosen WebSockets. – MonkeyZeus Jan 28 '14 at 13:47
  • Are you running multiple exclusive scripts for local data storage? If so, then you could think about doing some sort of `3/16 data files downloaded` counter rather than a percentage bar. From a UX standpoint I understand that people like seeing continuing progress, so you could give some sort of spinner or barber-shop-style rotating bar for a placebo effect. – MonkeyZeus Jan 28 '14 at 13:51
  • Maybe Flash or Java has something that could help, but then you are depending on non-native technologies that people may or may not have installed. – MonkeyZeus Jan 28 '14 at 13:52
  • @MonkeyZeus WebSockets are an interesting option, thanks for flagging that up. However, the server doesn't support that, so I guess I will have to stick to AJAX, using XHR2 to get some sort of feedback to display a loading percentage to the user. Yes, the script will have to download various other pieces of data and store them in separate localStorage keys. My main concern (more than showing % to the user) is the memory load of reading and writing the values. I might use JSLINQ and see if it can help. – Martin Jan 28 '14 at 15:53
  • Yeah I figured websockets might not "be in the cards" for you due to server configuration requirements but I am glad you appreciate the added info. In terms of memory load are you referring to the user's **[RAM](http://stackoverflow.com/questions/2936782/javascript-memory-limit)** or **[disk space](http://stackoverflow.com/questions/7267354/javascript-memory-and-html5-localstorage-limitations-on-smartphones)**? – MonkeyZeus Jan 28 '14 at 16:01
  • @MonkeyZeus I am just concerned about RAM and performance, as the JSON will be added to localStorage and then each set of data parsed to search for a matching value. Anyway, I might be getting too paranoid and going off topic :) – Martin Jan 28 '14 at 18:53
  • I am not sure what kind of magic JSLINQ can work, but it looks like it is simply an easy-access method for your arrays/JSON, making them query-able. I wouldn't really worry about 500 KB of data unless you were going to shove it into a single string variable. Wanna have some fun? Open up your task manager and find the memory usage section. Now navigate to a page where this code is live `` **insert evil laugh** – MonkeyZeus Jan 28 '14 at 19:22
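
Pulling the comment suggestions together (dataType: "text" from Kevin B, the XHR2 progress hook from the linked answers), here is a rough sketch of the single-request version with a download percentage. It assumes the server sends a Content-Length header so the browser can compute a total, and the #progress element is hypothetical:

$.ajax({
  url: "download_script.php",
  cache: false,
  dataType: "text", // per Kevin B's comment: stop jQuery from parsing the payload
  xhr: function() {
    // wrap the native XHR so we can listen for XHR2 progress events
    var xhr = $.ajaxSettings.xhr();
    xhr.addEventListener("progress", function(evt) {
      if (evt.lengthComputable) { // requires a Content-Length header
        var percent = Math.round((evt.loaded / evt.total) * 100);
        $("#progress").text(percent + "% downloaded"); // hypothetical element
      }
    });
    return xhr;
  },
  success: function(big_json_dump) {
    $.jStorage.set('some_key', big_json_dump); // stored as the raw JSON string
  }
});

Since the response is kept as text, the stored value is the raw JSON string, so it will need a JSON.parse when it is read back out of localStorage later.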

2 Answers

(You had a good discussion in the comments, but the question is left unanswered...)

"In order to not hang the browser" you need to split the data into smaller parts and fetch each part separately (e.g. in a loop). Otherwise, parsing a big chunk of JSON can hang the browser for a few milliseconds. RapidJSON can't help there, because RapidJSON is C++ and the browser talks JavaScript.

"To display the percentage of data" you need to inform the browser's JavaScript about the total number of parts first. Again, nothing to do with RapidJSON.

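One way to get that total across, sketched here with a hypothetical count_script.php endpoint that returns something like {"total_parts":100} (the #progress element is again hypothetical):

// hypothetical endpoint that reports how many parts the download is split into
$.getJSON("count_script.php", function(info) {
  var total_parts = info.total_parts;
  var parts_done = 0;

  function partLoaded() {
    parts_done++;
    var percent = Math.round((parts_done / total_parts) * 100);
    $("#progress").text(parts_done + " / " + total_parts + " parts (" + percent + "%)");
  }

  // ...fetch each part here and call partLoaded() in each success callback
});
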
There are also JavaScript streaming JSON parsers (like Clarinet) which, theoretically, can parse the large JSON document in chunks. This gives you more control over the parsing, at the cost of more CPU use and programming complexity. You'll probably need to introduce a Web Worker or artificial moments of inactivity to keep the browser responsive. But if you're using a Web Worker, you can do the standard JSON.parse in there: http://igorminar.github.io/webworker-json-perf/; http://blog.softwareispoetry.com/2013/05/using-web-workers-to-jsonparse-large.html
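
For the Web Worker route, a minimal sketch, assuming the JSON is fetched as text and a separate worker.js file is served next to the page:

// worker.js -- JSON.parse runs off the main thread, so the UI stays responsive
self.onmessage = function(e) {
  self.postMessage(JSON.parse(e.data));
};

// main page
var worker = new Worker("worker.js");

worker.onmessage = function(e) {
  // e.data is the parsed object, structured-cloned back from the worker
  $.jStorage.set('some_key', e.data);
};

$.ajax({
  url: "download_script.php",
  cache: false,
  dataType: "text",               // keep the payload as a plain string
  success: function(raw_json) {
    worker.postMessage(raw_json); // hand the string to the worker for parsing
  }
});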

ArtemGr

Paginate the results and execute $.ajax in a loop:

var total_records = 5000; // get this number to the browser somehow.
var per_page = 50;        // must match the LIMIT used in download_script.php
var total_pages = Math.ceil(total_records / per_page); // ceil so the last partial page is included

for (var i = 0; i < total_pages; i++) {
    (function(page) { // capture the current page number for the async callback
        $.ajax({
          url: "download_script.php",
          cache: false,
          data: { page: page },
          success: function(json_page) {
            // store each page under its own key so the pages don't overwrite each other
            $.jStorage.set('some_key_' + page, json_page);
          }
        });
    })(i);
}
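
To surface the kind of "n of total_pages downloaded" counter suggested in the comments, you could count completed requests; a minimal sketch, again with a hypothetical #progress element:

var pages_done = 0;

function pageLoaded() {
  pages_done++;
  var percent = Math.round((pages_done / total_pages) * 100);
  $("#progress").text(pages_done + " of " + total_pages + " pages (" + percent + "%)");
}

// ...and call pageLoaded() at the end of each request's success callback above.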

To paginate in PHP / in your SQL statement:

$page = intval($_GET['page']);
$per_page = 50;                 // must match per_page in the JavaScript above
$from = $page * $per_page;      // record cursor (offset)

// LIMIT <offset (record cursor)>, <total records to show>
$sql = "SELECT * FROM my_table LIMIT $from, $per_page";
Ryan