
What are some general (not specific to LAMP, .NET, Ruby, MySQL, etc.) tactics and best practices to improve page loading speed?

I am looking for tips about caching, HTTP headers, external file minification (CSS, JS), etc.

And for good tools like Google PageSpeed and Yahoo YSlow.

A "ultimate resource" wiki style checklist of "things not to forget" (moderated and updated by all the wizards here on SO) is the end goal. So folks don't have to Google around endlessly for outdated blog posts on the subject. ;)

I hope the "subjective" mods go easy on me; I know this is a bit open-ended, and similar questions have been asked here before. This material also overlaps a bit with the domain of Server Fault and Webmasters. But there is no central "wiki" question that really covers this, so I am hoping to start one. There are great questions like this that I refer to on SO all the time! Thanks.

thaddeusmt
2 Answers

  • Caching of page content
  • Load JavaScript at the bottom of the page
  • Minify CSS (and JavaScript)
  • CSS and JavaScript should be in their own [external] files
  • If possible, combine all JS or CSS files into one of each type (saves HTTP requests)
  • Load jQuery and jQuery UI from Google's CDN (they're likely already cached in many visitors' browsers)
  • Gzip compression
  • Serve images at the same width and height as in the markup (avoid in-browser resizing)
  • Use image sprites when appropriate (but don't overdo it)
  • Proper use of HTML elements, i.e. using <H#> tags for headers
  • Avoid div-itis (or the now more popular ul-itis)
  • Keep JavaScript selectors as focused as possible, i.e. $('h1.title') is much quicker than $('.title') (see the sketch below)
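A minimal sketch pulling several of these points together (the combined file names /css/all.min.css and /js/all.min.js are just placeholders):

    <!DOCTYPE html>
    <html>
    <head>
      <title>Example</title>
      <!-- One combined, minified stylesheet instead of many small files -->
      <link rel="stylesheet" href="/css/all.min.css">
    </head>
    <body>
      <h1 class="title">Welcome</h1>
      <p>Page content here.</p>

      <!-- Scripts at the bottom so they don't block page rendering -->
      <!-- jQuery from Google's CDN: may already be in the visitor's cache -->
      <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
      <!-- One combined, minified script instead of many small files -->
      <script src="/js/all.min.js"></script>
      <script>
        // Focused selector: naming the tag (h1) narrows the search,
        // instead of testing every element on the page for the class
        $('h1.title').addClass('highlight');
      </script>
    </body>
    </html>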
RDL
  • How does using proper tags improve loading speed? And is the "div-itis" thing just about reducing the size of the DOM tree for faster loading and rendering? – thaddeusmt Jan 26 '11 at 18:14
  • Proper use of page element tags and focused jQuery selectors can improve performance (I added a new point). If everything is a div and you use $('div.title'), then there really is no benefit. – RDL Jan 26 '11 at 18:24
  • What he means is that selectors should be as specific as possible with jQuery. With plain CSS, however, this is not always the case: deep CSS selectors can slow down the rendering of the styles. For example, "html body .my-div ul li h1" would render faster if written as ".my-div h1" – Scott Greenfield Sep 02 '11 at 01:25

Make your dynamic content more static.

If you can render your public pages as static content, you'll help proxies, caches, reverse proxies, and things like web application accelerators and DDoS-prevention infrastructure.

This can be done in several ways. By handling the cache headers, of course, but you can even think about serving real static pages with Ajax queries feeding the dynamic content, and certainly a mix of these two solutions, using the cache headers to make your main pages static for hours for most browsers and reverse proxies.
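A minimal sketch of the cache-header approach, assuming Node.js with Express (my choice for illustration, not something from the answer; the one-hour TTL is also just an example):

    var express = require('express');
    var app = express();

    app.get('/news', function (req, res) {
      // 'public' allows shared caches (proxies, reverse proxies) to keep
      // the response; max-age lets them serve it for 3600 seconds
      // without touching the application at all
      res.setHeader('Cache-Control', 'public, max-age=3600');
      res.send('<html><body>Rendered news page</body></html>');
    });

    app.listen(3000);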

The static-with-Ajax solution has a major drawback: SEO. Bots will not see your dynamic content. You need a way to feed bots this dynamic data (and a way to handle users reaching that data from search engine URLs, which is a big headache). So the pattern is to keep the really important SEO data in the static page, not in the Ajax-loaded dynamic content, and to limit the fancy Ajax interactions to the user experience. But the user experience on a composite page can be more dynamic than the search engine bot's experience. I mean: refresh the latest news every hour for bots, every minute for users.
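A sketch of that composite approach with jQuery (the /latest-news endpoint is hypothetical): the static page already contains the hourly, SEO-relevant markup, and a small script refreshes it far more often for real users.

    // Bots see the news baked into the static HTML (refreshed hourly);
    // real users get the block re-fetched every minute via Ajax.
    // '/latest-news' is a hypothetical URL returning an HTML fragment.
    $(document).ready(function () {
      setInterval(function () {
        $('#latest-news').load('/latest-news');
      }, 60 * 1000);
    });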

You also need to prevent premature use of session cookies. Most proxy caches will refuse to cache any HTTP response for a request carrying a cookie (this follows the official HTTP specification). The problem is often an application that puts the login form on every page and then needs an existing session for the login form's POST. This can be fixed with separate login pages, or with advanced redirects on the login POST. Cookie handling can also be tuned in a modern reverse proxy cache like Varnish with some configuration settings.
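One way to sketch the "separate login page" fix, again assuming Express (the routes are hypothetical, and the Set-Cookie value is purely illustrative):

    var express = require('express');
    var app = express();

    // Public pages never send Set-Cookie, so proxy caches can keep them
    app.get('/article/:id', function (req, res) {
      res.setHeader('Cache-Control', 'public, max-age=3600');
      res.send('<html><body>Article, with a link to /login</body></html>');
    });

    // Only the dedicated login routes touch cookies, so only these
    // responses become uncacheable
    app.get('/login', function (req, res) {
      res.send('<form method="post" action="/login">login fields</form>');
    });
    app.post('/login', function (req, res) {
      res.setHeader('Set-Cookie', 'sessionid=abc123; HttpOnly'); // illustrative
      res.redirect('/account');
    });

    app.listen(3000);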

edit: One advanced usage of a reverse proxy can be really useful: ESI (Edge Side Includes), for example with Varnish's ESI support. You put tags in the HTML you render that the ESI reverse proxy will identify. Each of these identified regions can have a different TTL (Time To Live): let's say 1 day for the whole page, 10 minutes for a latest-news block, 0 for the chat block. And the reverse proxy will fill these blocks from its own cache or by making requests to your backend.
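As an illustration of the markup side (the URLs are invented), the backend might render a page like this; each included block is then fetched and cached separately by the ESI-capable proxy, with the per-block TTL coming from the cache headers each included URL returns:

    <!-- Whole page: served with headers making it cacheable for 1 day -->
    <html>
    <body>
      <h1>My site</h1>
      <!-- /latest-news is served with a 10-minute TTL -->
      <esi:include src="/latest-news" />
      <!-- /chat is served with a TTL of 0, so it's fetched every time -->
      <esi:include src="/chat" />
    </body>
    </html>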

Since the web has existed, handling proxies and caches has always been the main technique to fool the user into thinking the web is fast.

regilero
    Some interesting points. It's true, serving up a flat page is quicker than hitting the DB to generate all of the content. – thaddeusmt Jan 26 '11 at 22:56
  • Especially if you do not serve it yourself: the chain of reverse proxy caches between you and the final browser can do it for you, for a long time. – regilero Jan 26 '11 at 23:03