Looking at the page loads in the Chrome Developer Tools Network tab, I get the following timings for this link:
http://www.seed-city.com/barneys-farm-seeds/liberty-haze
102 requests | 131.24KB | 8.41s (onload: 8.50s, DOMContentLoaded: 6.61s)
102 requests | 131.47KB | 8.47s (onload: 8.56s, DOMContentLoaded: 6.82s)
102 requests | 131.45KB | 8.71s (onload: 8.79s, DOMContentLoaded: 7.06s)
Now, making one (somewhat simple) change:
103 requests | 110.16KB | 3.77s (onload: 3.86s, DOMContentLoaded: 2.03s)
103 requests | 110.17KB | 3.85s (onload: 3.94s, DOMContentLoaded: 2.21s)
103 requests | 110.16KB | 4.20s (onload: 3.50s, DOMContentLoaded: 2.28s)
Here is a typical HAR log output by the Network tab for your current page:
http://pastebin.com/pM5Dd1Fw
The important part is to realize that the browser waits nearly five seconds before it can do anything at all (snipped to show only the relevant lines):
{
    "startedDateTime": "2012-09-29T04:58:07.861Z",
    "time": 4831,
    "request": {
        "method": "GET",
        "url": "http://www.seed-city.com/barneys-farm-seeds/liberty-haze",
        "httpVersion": "HTTP/1.1",
        ...
    },
    "cache": {},
    "timings": {
        "blocked": 0,
        "dns": 16,
        "connect": 129,
        "send": 0,
        "wait": 4677,  // <<< Right here
        "receive": 8,
        "ssl": -1
    }
}
And the timing for the slow.html page:
{
    "startedDateTime": "2012-09-29T05:04:51.624Z",
    "time": 92,
    "request": {
        "method": "GET",
        "url": "http://example.com/t/slow.html",
        "httpVersion": "HTTP/1.1",
        ...
    },
    "timings": {
        "blocked": 0,
        "dns": 7,
        "connect": 38,
        "send": 0,
        "wait": 44,  // <<< Right here
        "receive": 1,
        "ssl": -1
    },
    "pageref": "page_3"
}
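As a cross-check, an entry's total time is (give or take a millisecond of rounding) the sum of its non-negative timings phases. A quick sketch with the numbers copied from the two entries above shows the server "wait" accounts for roughly 97% of the slow request:

```python
# Timings copied from the two HAR entries above, in milliseconds.
slow_page = {"blocked": 0, "dns": 16, "connect": 129, "send": 0,
             "wait": 4677, "receive": 8, "ssl": -1}
fast_page = {"blocked": 0, "dns": 7, "connect": 38, "send": 0,
             "wait": 44, "receive": 1, "ssl": -1}

def total_ms(timings):
    # Per the HAR spec, -1 means "does not apply" and is excluded from the total.
    return sum(v for v in timings.values() if v >= 0)

print(total_ms(slow_page))  # 4830 (entry reports 4831; rounding)
print(round(slow_page["wait"] / total_ms(slow_page), 2))  # 0.97 -- wait dominates
print(total_ms(fast_page))  # 90
```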
That's 4677 ms vs. 44 ms. Here is a typical HAR log for that updated page:
http://pastebin.com/Jgm1YyU3
So what did I do to achieve those (very real and tangible) improvements? I moved the JavaScript to the bottom of the body tag:
http://pastebin.com/H9ajG99H
This makes a two-fold improvement. First, you're allowing the browser to continue and get your content loaded before it starts interacting with blocking processes (the script calls). So your customers will "notice" a speedier site, simply because the page seems to load faster (when really you're just allowing it to load as it goes). Second, you wait until the body tag is essentially fully loaded before you start monkeying with its content.
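In other words, the change is just moving the blocking script tags from the head down to the end of the body. A minimal sketch (the file names here are placeholders, not your actual scripts):

```html
<html>
  <head>
    <!-- Stylesheets can stay up top; it's the scripts that block parsing -->
    <link rel="stylesheet" href="styles.css">
  </head>
  <body>
    <!-- ... page content renders first ... -->

    <!-- Scripts load and run after the content is already on screen -->
    <script src="jquery.js"></script>
    <script src="site.js"></script>
  </body>
</html>
```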
It also looks like you're not using cache directives in your headers to tell the browser not to check for an updated copy of a file for a given period. So for your Twitter and Facebook icons, the browser apparently thinks it needs to recheck with the server, making a round trip that should, in theory, be unnecessary except every so often. Look at the time spent "waiting":
twitter_aqu_64.png 1.32s
ajax-loader.gif 1.31s
ssl-icon.jpg 1.31s
Now, I'm not an expert in cache-control; it's a little bit of a dark art. But those timings make me think there may be something going on there.
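If the site is running on Apache, one starting point would be mod_expires rules in an .htaccess file. This is only a sketch, assuming mod_expires is available; adjust the lifetimes to how often those images actually change:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Tell browsers not to revalidate these for a month
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType image/gif  "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
</IfModule>
```

With headers like these, the browser skips the round trip entirely until the lifetime expires, instead of re-asking the server about every icon on every page view.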
Try this answer for more information on cache-control:
HTML Cache control