
I was thinking about creating a script that would do the following:

  1. Get all JavaScript files from the JS directory used on the server
  2. Combine all scripts into one, so the browser makes a single request instead of several
  3. Minify the combined script
  4. Cache the resulting file

Let's say the order in which the files need to be loaded is written in a config file somewhere.

Now, when I load myexamplepage.com, I actually use jQuery, Backbone, MooTools, Prototype and a few other libraries, but instead of asking the server for each of these files, I call myexamplepage.com/js/getjs and what I get back is a single combined and minified JS file. That way I eliminate those additional requests to the server. And as I read on the net about speeding up websites, the more requests you make to the server, the slower your site becomes.
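
For concreteness, here's a rough sketch of what I imagine that getjs endpoint could look like in PHP. The config file name, cache path, and the minify() helper are all made up for illustration, not an existing library:

```php
<?php
// Rough sketch of the getjs endpoint described above. The config file
// name, cache path, and minify() helper are assumptions for illustration.

$configFile = __DIR__ . '/js-order.conf';          // load order, one path per line
$cacheFile  = __DIR__ . '/cache/combined.min.js';  // cached combined output

$files  = array_filter(array_map('trim', file($configFile)));
$newest = max(array_map('filemtime', $files));

// Rebuild the cache only when a source file is newer than the cached copy.
if (!file_exists($cacheFile) || filemtime($cacheFile) < $newest) {
    $combined = '';
    foreach ($files as $file) {
        $combined .= file_get_contents($file) . ";\n"; // ';' guards against missing semicolons
    }
    file_put_contents($cacheFile, minify($combined)); // minify() is a placeholder helper
}

header('Content-Type: application/javascript');
readfile($cacheFile);
```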

Since I'm pretty new to the programming world, I know that many things I think of already exist, and I don't think this is an exception either.

So please list anything you know of that does exactly, or something similar to, what I described. (Please note: the point is that you shouldn't need to run a minifier or other third-party software by hand every time your scripts change; you keep the original file structure and only use a helper class.)

P.S. I think the same method could be used for CSS files as well.

I'm using PHP and Apache.

    What server-side technology are you using for your website? – Richard Ev Apr 19 '12 at 12:25
  • possible duplicate of [What do you use to minimize and compress JavaScript libraries?](http://stackoverflow.com/questions/599911/what-do-you-use-to-minimize-and-compress-javascript-libraries) – epascarello Apr 19 '12 at 12:28
  • @Richard Ev - at the time PHP, but I'm thinking about the concept itself, and I would like to read about solutions for any platform and what their approaches are. – Vytautas Butkus Apr 19 '12 at 12:35
  • @epascarello you probably didn't read the whole post, because I said that it's not about that. – Vytautas Butkus Apr 19 '12 at 12:36
  • Do you not understand the idea behind Make, Ant, Maven? They control the process of building files to push out. If you change one file, the make script takes care of it and changes only the files that are affected. Doing this at runtime will lead to bad performance and unprimed caches. – epascarello Apr 19 '12 at 18:03

3 Answers


Rather than having the server do this on-the-fly, I'd recommend doing it in advance: Just concatenate the scripts and run them through a non-destructive minifier, like jsmin or Google Closure Compiler in "simple" mode.
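
For example, a minimal build script in PHP might look something like this (a sketch only: the file list and version number are placeholders, and it assumes the jsmin-php port with its JSMin::minify() method; any minifier would do):

```php
<?php
// Hypothetical one-shot build step: concatenate in load order, then minify.
// Assumes the jsmin-php port (JSMin::minify) is available; file names and
// the version number are placeholders.

require 'jsmin.php';

$sources = array('jquery.js', 'backbone.js', 'app.js'); // load order matters
$version = 5;                                           // bump on each release

$combined = '';
foreach ($sources as $src) {
    $combined .= file_get_contents($src) . ";\n";
}

file_put_contents("all-my-js-v{$version}.js", JSMin::minify($combined));
```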

This also gives you the opportunity to put a version number on that file, and to give it a long cache life, so that users don't have to re-download it each time they come to the page. For example: Suppose the content of your page changes frequently enough that you set the cache headers on the page to say it expires every day. Naturally, your JavaScript doesn't change every day. So your page.html can include a file called all-my-js-v4.js which has a long cache life (like, a year). If you update your JavaScript, create a new all-in-one file called all-my-js-v5.js and update page.html to include that instead. The next time the user sees page.html, they'll request the updated file; but until then, they can use their cached copy.
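
If page.html is actually generated by PHP, the "update page.html to include that instead" step can be automated by reading the current version from a file the build script writes; a sketch, assuming a js-version.txt convention:

```php
<?php
// Assumed convention: the build script writes the current build number to
// js-version.txt, so the page always references the newest bundle.
$version = trim(file_get_contents('js-version.txt'));
echo '<script src="all-my-js-v' . $version . '.js"></script>';
```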

If you really want to do this on-the-fly, and you're using Apache, you could use mod_pagespeed.

T.J. Crowder
  • I understand what you are talking about and I know about 3rd-party minifiers and combiners, but if I decided to add another line to a JS file I would need to repeat the same operation for all files - that takes some time, doesn't it? edit: I read a little bit about mod_pagespeed; it seems quite interesting, I will come back to it later. But this wouldn't really be the same approach, because it would require installing this mod on the server, and I suppose if you are sitting behind a shared server you couldn't achieve this. – Vytautas Butkus Apr 19 '12 at 12:38
  • @VytautasButkus: *"that takes some time, doesnt it?"* Not in any real way. As an experiment, I just ran 8M of (largely unrelated) JavaScript files through `jsmin`. Took a quarter of a second. Closure Compiler was slower, 34 seconds, but that's because it does a lot more work (and gets better results). And I'm guessing you have a lot less than 8M of JavaScript. (It does the jQuery 1.7.2 file in 6.5 seconds.) This is on my dual-core 2.4GHz machine, so not a hyper-charged monster at all. – T.J. Crowder Apr 19 '12 at 12:53
  • I wasn't talking about the time the software takes to compress the file, I was talking about the time you need to put all 5 or more files into the software and then upload them. And suppose you do daily updates - then you need to do it every day. How much time do you waste on that per week? Suppose you could skip this process entirely. – Vytautas Butkus Apr 19 '12 at 13:12
  • @VytautasButkus: If you have a script that triggers the process and increments a build number, the time spent is near zero. But no, I'm not aware of any solution that makes it *completely* automatic, including (say) updating `page.html` to refer to the newest version. (If your page is created via PHP, of course you can use PHP to do that part automatically by reading the version number from whatever file your script uses.) – T.J. Crowder Apr 19 '12 at 13:21

If you're using .NET, I can recommend Combres. It does combination and minification of JavaScript and CSS files.

Richard Ev

I know this is an old question, but you may be interested in this project: https://github.com/OpenNTF/JavascriptAggregator

Assuming you use AMD modules for your JavaScript, this project will create highly cacheable layers on demand. It has other features you may be interested in as well.

ddumont