
I'm working on a large project that is extensible with modules. Every module can have its own JavaScript file, which may be needed on only one page, on multiple pages that use the module, or on every page if it is a global extension.

Right now I'm combining all .js files into one file whenever they get updated or a new module gets installed. The client only has to load one "big" .js file, but has to parse it on every page. Let's assume someone has installed a lot of modules and the .js file grows to 1MB-2MB. Does it make sense to continue down this route, or should I include each .js file only where it is needed?

The latter would result in maybe 10-15 more HTTP requests per page. At the same time, the parsing time would shrink, since only a small portion of the JavaScript has to be loaded for each page, and the browser wouldn't try to execute code that isn't required on the current page (or can't even run there).

Comparing both scenarios is rather difficult for me since I would have to rewrite a lot of code. Before I continue, I would like to know whether someone has encountered a similar problem and how he/she solved it. My biggest concern is that the parsing time of the .js file grows too much. Usually network latency is the biggest concern, but I've never had to deal with this many possible modules/extensions, i.e. .js files.

  • possible duplicate of [Most Efficient Multipage RequireJS and Almond setup](http://stackoverflow.com/questions/17035609/most-efficient-multipage-requirejs-and-almond-setup) – Benjamin Gruenbaum Aug 04 '13 at 17:21
  • Short answer - it highly depends on how long users stay on your site, your target audience, what the JavaScript code does and why they visit. Generally, when building an app that _actually requires_ a whole megabyte of JS code, it should be a single page application anyway. – Benjamin Gruenbaum Aug 04 '13 at 17:23
  • Take a look at http://stackoverflow.com/questions/1918996/how-can-i-load-my-own-js-module-with-goog-provide-and-goog-require, and https://developers.google.com/closure/library/docs/tutorial. – amdn Aug 04 '13 at 17:23
  • @BenjaminGruenbaum is right, but remember, do your own tests. Each case is different. – Frizi Aug 04 '13 at 17:24
  • it's a lot faster to download three 1/3 MB files than one 1 MB file, since browsers can pull at least 4 URLs from your site at once (many can pull 8), and each of those URLs has an opportunity to be cached and never re-sent; but if you rebuild the big file, you'll have to ship the whole thing with each modification. The old advice of bundling everything in one URL has been outmoded by smarter browser pipelining and parallel yet internally deferred script tag fetching. – dandavis Aug 04 '13 at 20:16
  • Frankly... getting clever with pageload stuff is likely to introduce bugs - worse still, unpleasant, no-repro, network-timing-related ones. If you need to make the loading faster, there's no shortcut, you gotta profile it, not to mention test it. If it's not an issue (yet), do your users a favor and don't add complexity to "solve" an as-yet-hypothetical problem. – AdamKG Aug 04 '13 at 22:28

2 Answers


If the following two conditions are true, then it doesn't really matter which path you take, as long as you satisfy the requirement below.

Condition 1: The JavaScript files are being run inside a standard browser, meaning they are not going to be run inside an Apple iOS UIWebView app (an HTML5 iPhone/iPad app).

Condition 2: The initial page load time does not matter so much. In other words, this is more of a web application than a web page. Users log in each day, stay logged in for a long time, do lots of stuff... log out... come back the next day...

Requirement:

Put the JavaScript file(s), CSS files and all images under a /cache directory on the web server. Tell the web server to send a max-age of 1 year in the response header (only for this directory and its sub-directories). Then, once the browser has downloaded a file, it will never again waste a round trip to the web server asking whether it has the most recent version.
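For illustration, here is a minimal sketch of that header setup, assuming a Node.js/Express server (the /public/cache path and the port are placeholders; the same idea applies to an Apache or nginx configuration):

var express = require('express');
var app = express();

// Everything under /cache is served with a one-year max-age, so the
// browser caches it and does not revalidate it for that period.
app.use('/cache', express.static(__dirname + '/public/cache', {
  maxAge: '365d'  // sent as Cache-Control: public, max-age=31536000
}));

app.listen(3000);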

Then you will need to implement JavaScript versioning. Usually this is done by adding "?jsver=1" to the js include line and incrementing the version with each change.
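As a sketch of what that versioning could look like on the client side (the JS_VERSION variable and the /cache/modules.js path are made-up names for illustration):

var JS_VERSION = 2; // bump with every deployment so the changed URL bypasses the year-long cache

function loadVersioned(src)
{
 var s = document.createElement('script');
 s.src = src + '?jsver=' + JS_VERSION;
 document.getElementsByTagName('head')[0].appendChild(s);
}

// loadVersioned('/cache/modules.js') requests /cache/modules.js?jsver=2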

Use the Chrome inspector and make sure this is set up correctly. After the first request, the browser should never send an ETag or ask the web server for the file again for 1 year. (Hard reloads will download the file again, so test by following links along a navigation path a user would normally take. Also watch the web server log to see which requests are actually being served.)

Good browsers will compile the JavaScript to machine code, and the compiled code will sit in the browser's cache waiting for execution. That's why Condition 1 is important: today, the only browser which will not JIT-compile JS code is Apple's Safari inside UIWebView, which only happens when you are running HTML/JS inside an Apple app (one downloaded from the App Store).

Hope this makes sense. I've done these things and have reduced network round trips considerably. Read up on ETags and how browsers make round trips to determine whether they have the current version of a js/css/image file.

On the other hand, if you're building a web site and you want to optimize for the first time visitor, then less is better. Only have the browser download what is absolutely needed for the first page view.

Brian McGinity
  • When this is set up correctly, your web server logs will show that the js file is only asked for one time. Also you will never see a '304 Not Modified' return code AGAIN!!!! – Brian McGinity Aug 05 '13 at 21:37

You really REALLY should be using on-demand JavaScript. Only load up front what 90% of users will use; for things most people won't use, keep them separate and load them on demand. Also, you should seriously reconsider what you're doing if you've got upwards of two megabytes of JavaScript after compression.

function ondemand(url,f,exe)
{
 // If the function f is already defined, call it right away.
 if (eval('typeof ' + f)=='function') {eval(f+'();');}
 else
 {
  // Otherwise inject a <script> tag for scripts/<url>.js into <head>
  // and poll until the function shows up.
  var h = document.getElementsByTagName('head')[0];
  var js = document.createElement('script');
  js.setAttribute('defer','defer');
  js.setAttribute('src','scripts/'+url+'.js');
  // Reuse the type attribute of the first script tag on the page.
  js.setAttribute('type',document.getElementsByTagName('script')[0].getAttribute('type'));
  h.appendChild(js);
  ondemand_poll(f,0,exe);
  h.appendChild(document.createTextNode('\n'));
 }
}


function ondemand_poll(f,i,exe)
{
 // Check every 50ms (up to 200 tries, i.e. 10 seconds) whether f has been
 // defined by the freshly loaded script. Once it exists, call it if exe==1;
 // after 200 failed tries, give up with an alert.
 if (i<200) {setTimeout(function() {if (eval('typeof ' + f)=='function') {if (exe==1) {eval(f+'();');}} else {i++; ondemand_poll(f,i,exe);}},50);}
 else {alert('Error: could not load \''+f+'\', certain features on this page may not work as intended.\n\nReloading the page may correct the problem, if it does not check your internet connection.');}
}

Example usage: load example.js (first parameter), poll for the function example_init1() (second parameter), and 1 (third parameter) means execute that function once polling finds it:

function example() {ondemand('example','example_init1',1);}
John
  • what's up with the eval? – dandavis Aug 04 '13 at 20:12
  • If you pass an undefined object it'll throw an error. If you really can't live with eval you could replace it with `try {} catch(e) {}`. I'm extremely strict about how I code; eval is not as evil as many people suggest it is, at least on the client and when used correctly. – John Aug 04 '13 at 20:26
  • i don't like try catch much, so i re-wrote it to work the same as yours without the nasty bits. only thing you give up is array-notation capability for the presence detection, but dot paths and globals still work. i hope you don't mind. (i was bored). – dandavis Aug 04 '13 at 22:20