
I'm working on a project that uses .json files as the main source of data. The application is built with Bootstrap, JavaScript/jQuery, and HTML5. The code was written a few months ago, and I'm trying to improve its efficiency and bring it up to date.

The first thing I noticed while reviewing the code was the way data is loaded into the application. There are a few .json files used for different screens, and they are scattered across different locations.

Also, every time an onclick handler fires, for example, the .json file is reloaded. There is no reason to do that, since the data is only updated once a month. I'm wondering why this couldn't be done just once (the first time the application loads), with the data then kept in a JS object?

Is that good practice, or is there something better? Here is an example of how I'm thinking of updating the code:

var jsonData = {};
$(document).ready(function() {
  $.getJSON('data.json', function(data){ 
    // Load JSON data in JS object.
    jsonData = data;
  });
});

Should the code above be placed in the HTML head or body? I know that nowadays .js files are included at the bottom of the body tag and all .css goes in the head. Is there any difference when it comes to including JSON files? If anyone has any suggestions, please let me know. The JSON files have around 600+ records with multiple fields (over 30), and that might change in the future. So if these files get bigger, I need to make sure that won't affect the overall efficiency of the application.

Rory McCrossan
espresso_coffee
  • You could store the data in a session cookie perhaps? You just have to ensure you get the updated data when it changes. This could potentially be done by cache-busting the JSON file's query string. – Zach M. Nov 30 '18 at 15:03
  • @ZachM. I have never done that before with session cookies. Is that better practice than a JS object? Are there any major differences between the two methods? Can you please provide an example? Thank you. – espresso_coffee Nov 30 '18 at 15:06
  • You would be storing the JSON in the session cookie: https://stackoverflow.com/questions/42020577/storing-objects-in-localstorage – Zach M. Nov 30 '18 at 15:07
  • @ZachM. localStorage is not a cookie. Using cookies makes no sense. – charlietfl Nov 30 '18 at 15:08
  • Also, you may not want to assume the data will only be updated once a month; that practice could change. Google "file cache busting" and you will find examples. You would have a function that checks whether the file is newer and updates local storage. This is just one way to solve the problem. – Zach M. Nov 30 '18 at 15:08
  • @charlietfl You are right, I misspoke above and was thinking of local storage. – Zach M. Nov 30 '18 at 15:09
  • @ZachM. I have used local storage before for a different purpose (for example, saving form data) but not for this. Is there any reason why local storage is recommended over a plain JS object? I will read more about file cache busting; that seems helpful. Thank you. – espresso_coffee Nov 30 '18 at 15:10
  • This question is asking for opinions. – Taplar Nov 30 '18 at 15:10
  • By doing the AJAX call you are not including the JSON files the same way as a script file. I think your approach is good, and here is an explanation of why scripts are put last: https://stackoverflow.com/questions/436411/where-should-i-put-script-tags-in-html-markup – Bashar Ali Labadi Nov 30 '18 at 15:15
  • @BasharAliLabadi I'm not sure that I understand the answer. – espresso_coffee Nov 30 '18 at 15:16
  • Including a script tag is different from doing an AJAX call: the browser will block on script tags to load and parse them. I mean your question "Is there any difference when it comes to including JSON files?" is not valid unless you hard-code the JSON inside script files and include them. But the snippet you are showing here is good; that's how the front end calls the back end for data, whether that's an API or a file. – Bashar Ali Labadi Nov 30 '18 at 15:20

2 Answers


In my view you are correct to think that the files shouldn't be loaded on every onclick event. I agree that you should load them beforehand.

The correct place to load them is before any JS code that uses them. Scripts are placed at the bottom of the page because the DOM has to be loaded already for the JS code to work, so it's natural to describe the page first and then load the code that runs on it.

Also, 600+ records, even with 30 fields each, is a minimal amount of data that fits gently in memory. I would load all the JSON files beforehand and use them directly from a variable in memory. If you think this will grow a lot (by a lot I mean 100,000+ records), then I would use localStorage for that.
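As a sketch of the localStorage route (the `DATA_VERSION` tag and `loadCached` helper are illustrative names of mine, not from the question): store the parsed JSON together with a version tag, and only hit the server again when the tag changes:

```javascript
// Hedged sketch: cache the monthly JSON in localStorage under a version tag.
// DATA_VERSION and loadCached are illustrative, not part of the original code.
var DATA_VERSION = '2018-11'; // bump this when the monthly data is refreshed

function loadCached(key, fetchJson) {
  var raw = localStorage.getItem(key);
  if (raw) {
    var entry = JSON.parse(raw);
    if (entry.version === DATA_VERSION) {
      return Promise.resolve(entry.data); // cache hit: no request is made
    }
  }
  // Cache miss or stale version: fetch once, then store for next time.
  return fetchJson().then(function (data) {
    localStorage.setItem(key, JSON.stringify({ version: DATA_VERSION, data: data }));
    return data;
  });
}
```

With jQuery this would be called as, e.g., `loadCached('screen1', function () { return $.getJSON('screen1.json'); })`.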

I'll give you another option, though: in one of my systems I load approximately 25,000 records into a full-blown in-memory database, this takes much less than 1s, and a select against that database is immediate. You have full SQL available, so this could be a good approach for you. I'm talking about SQLite compiled to JavaScript: https://github.com/kripken/sql.js/

I tested several in-memory databases and I strongly recommend this one.

Edit

Answering @espresso_coffee:

I use the following code to import the JSON into SQLite (I use RequireJS):

define(['jquery', 'sqlite', 'json!data/data.json'],
function($, sqlite, jsonData) {

    var self = {};

    var db;

    function createDb() {
        return new Promise((res)=>{
            db = new sqlite.Database();
            db.run("CREATE VIRTUAL TABLE usuarios USING fts4(field1 int, field2 text, field3 text, field4 text, field5 text, field6 text, field7 text);");
            res(1);
        })
    }

    function populateDB( jsonData ) {
        return new Promise((res)=>{
            // Insert into the table created above; bind one row per record.
            var stmt = db.prepare("INSERT INTO usuarios VALUES (?,?,?,?,?,?,?)");
            db.run("BEGIN TRANSACTION");
            jsonData.list.forEach((rec)=>{
                stmt.run([rec.field1, rec.field2, rec.field3, rec.field4, rec.field5, rec.field6, rec.field7]);
            });
            stmt.finalize();
            db.run("END");
            updateDOM();
            res(1);
        });
    }

    (...)

This is the code that loads the 25000 records in a split second.
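For a sense of scale (an illustrative benchmark of mine, not from the answer above): even without SQL, filtering a plain in-memory array of 25,000 records is effectively instant, so the 600-record files from the question are comfortably handled either way:

```javascript
// Illustrative only: 25,000 fake records, filtered with plain Array#filter.
var records = [];
for (var i = 0; i < 25000; i++) {
  records.push({ id: i, name: 'user' + i, group: i % 30 });
}

var start = Date.now();
var hits = records.filter(function (r) { return r.group === 7; });
var elapsed = Date.now() - start; // typically a few milliseconds at most
```

The SQL route earns its keep once you need joins, full-text search (as the fts4 table above suggests), or ad-hoc queries; for simple lookups a plain array is enough.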

Nelson Teixeira
  • Nelson, thanks for explaining and providing useful feedback. As of now we use an Oracle database. I'm not sure if SQLite is something that can be easily implemented in the system, or whether that would require major changes/updates. Also, would you recommend putting the `getJson` function in a `.js` file and then including that in the body tag as the first script tag? Or should that be in the `head` at the top of the HTML file, as I showed in my example above? Thank you! – espresso_coffee Nov 30 '18 at 15:29
  • No JS should be in the head; all of it should be at the bottom. The SQLite version I mentioned isn't an installable database on your system, it's a db COMPILED to js. You just include one js file in your code and you're ready to use the full database in memory, so you don't need to install anything besides this. – Nelson Teixeira Nov 30 '18 at 15:34
  • As for whether you should put it in a file: it depends on how large the code that loads the JSON is. The way I understand it, this code is only a few lines long. If that's the case, you can just include it in the js initialization code. – Nelson Teixeira Nov 30 '18 at 15:36
  • I have one more question about SQLite. Is it supported in all browsers? Also, is it a good fit for single-page applications with frameworks like AngularJS or ReactJS? I started reading about it, but I would like to make sure that I'm going in the right direction. This application is a single page and I would potentially use Angular or React in the future to replace jQuery. – espresso_coffee Nov 30 '18 at 16:30
  • AFAIK yes, it is. If you come across a bug in some browser, please let me know. I haven't found any yet. – Nelson Teixeira Nov 30 '18 at 16:33
  • I have looked over the GitHub project you linked above. There is no example of how to import a JSON file into SQLite. Do you have an example of how to do that? – espresso_coffee Nov 30 '18 at 17:39

One approach would be to store the $.getJSON promises, so you only ever make one request to the server per URL (when actually needed) and re-use the same promise for any future calls for the same data:

var getData = (function() {
  var baseUrl = 'http://jsonplaceholder.typicode.com/';
  var promises = {};

  function getData(url) {
    console.log(promises[url] ? 'Existing promise' : 'New request', ' URL ::', url)
    promises[url] = promises[url] || $.getJSON(baseUrl + url);
    return promises[url];
  }
  return getData;

})();


// do multiple requests    
getData('todos').then(function(res) { /* do something with results*/ })
getData('todos').then(function(res) {
  console.log(' Second request array length=',res.length)
})

getData('todos/1')
getData('todos/1')
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
charlietfl