
I am trying to quantify "site slowness". In the olden days you just made sure that your HTML was lightweight, your images optimized, and your servers not overloaded. On high-end sites built on modern content management systems there are a lot more variables: third-party advertising, trackers, and various other callouts; the performance of the CDN (interestingly enough, content delivery networks sometimes make things worse); JavaScript execution; CSS overload; as well as all kinds of server-side issues like long-running queries.

The obvious answer is for every developer to clear the cache and continuously look at the "net" section of the Firebug plugin. What other ways to measure "site dragging ass" have you used?

– deadprogrammer

14 Answers


YSlow is a tool (a browser extension) that should help you.

YSlow analyzes web pages and why they're slow based on Yahoo!'s rules for high performance web sites.

– Will
@kohlerm A lot of time has passed since your comment. YSlow is no longer Firefox-only: I use it on Chrome, for example! – JeanValjean Feb 10 '13 at 10:46

Firebug, the must-have Firefox extension for web developers, can measure the loading time of the different elements on your webpage. At least you can rule out CSS, JavaScript, and other elements taking too much time to load.

If you do need to shrink JavaScript and CSS loading times, there are various JavaScript and CSS compressors out there on the web that simply strip unnecessary text, like newline characters and comments. Of course, keep an ordinary version on the side for development's sake.
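As a rough illustration of what such compressors strip (a naive sketch only; real minifiers, such as the YUI Compressor of that era, actually parse the source and handle the edge cases a regex approach mangles, e.g. comment-like text inside string literals):

// naiveMinify is a hypothetical name -- illustration, not a real tool
function naiveMinify(src) {
    return src
        .replace(/\/\*[\s\S]*?\*\//g, '') // drop /* ... */ block comments
        .replace(/^\s*\/\/.*$/gm, '')     // drop whole-line // comments
        .replace(/\n\s+/g, '\n')          // collapse indentation
        .replace(/\n{2,}/g, '\n')         // collapse blank lines
        .trim();
}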

If you use PNGs, I recently came across a PNG optimizer called OptiPNG that can shrink PNG file sizes.

– Ido Schacham

"Page Load time" is really not easy to define in general. It depends on the browser you use, because different browsers may do more requests in parallel, because javascript has differents speeds in different browsers and because rendering time is different.

Therefore you can only really measure your true page load time using the browser you are interested in. The end of the page load can also be difficult to define, because there might be an Ajax request after everything is visible on the page. Does that count toward the page load or not?
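If you do want a single number from the browser itself, the Navigation Timing API (a later addition to browsers than this answer, so treat this as a sketch of the idea rather than the original author's method) gives one common definition:

window.addEventListener('load', function () {
    // loadEventEnd is only populated once the load handler has returned,
    // hence the setTimeout
    setTimeout(function () {
        var t = window.performance.timing;
        console.log('load time (ms):', t.loadEventEnd - t.navigationStart);
    }, 0);
});

Note that this still stops at the load event, so the post-load Ajax case above remains unmeasured.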

And last but not least, the real page load time might not matter that much, because the "perceived performance" is what matters. For the user, what matters is when they have enough information to proceed.

Markus

I'm not aware of any way (at least none I could tell you about :]) that would automatically measure your page's perceived load time.

Use AOL Pagetest for IE and YSlow for Firefox (link above) to get a "feeling" for your load time.

– kohlerm

Get yourself a proper debugging proxy installed (I thoroughly recommend Charles).

Not only will you be able to see a full breakdown of response times/sizes, you can also save the data for later analysis/comparison, and fiddle with the requests/responses, etc.

(Edit: Charles' support for debugging SOAP requests is worth the pittance of its shareware fee - it's saved me a good half a day of hair-loss this week alone!)

– Ian

I routinely use webpagetest.org, which lets you run performance tests from different locations and on different browsers (although only MSIE 7-9), with different settings (number of iterations, connection speed, first run vs. second visit, excluding specific requests if you want, credentials if needed, ...).

The result is a very detailed report of page loading time, which also provides advice on how to optimize.

It really is a great (free) tool!

– futtta
  • ow, this was an old question! weird (inconvenient) that SO's rss feeds include old questions whenever someone posts a reply ... – futtta Jan 12 '11 at 09:48

Last time I worked on a high-volume website, we did several things to keep on top of performance.

If you want a quick look, say a first approximation, I'd go with YSlow and see what the major factors affecting page load time in your app are.

– Dafydd Rees

Well, call me old-fashioned, but...

time curl -L http://www.example.com/path

on Linux :) Other than that, I'm a big fan of YSlow, as previously mentioned.

– f4nt

PageSpeed is an online checking tool by Google, which is very accurate and reliable:

https://developers.google.com/pagespeed/

– Simon Steinberger

If it's ASP.NET you can use Trace.axd.

Yahoo provides YSlow, which can be great for checking JavaScript.

– dove

YSlow, as mentioned above.

And combine this with Fiddler. It is good if you want to see which page objects are taking the most bandwidth, which are being compressed at the server, unexpected round trips, and what is being cached. And it can give you a general idea of processing time in the client web browser as compared to the time taken between server and client.

– James Gardner

ApacheBench. Use

ab -c <number of CPUs on server> -n 1000 url

to get a good approximation of how fast your page is. Bear in mind that ab only fetches the HTML document itself, not the images, CSS, and scripts it references.

– bh213

In Safari, the Network Timeline (available under the Develop menu, which you have to specifically enable) gives useful information about loading time of individual page components, as well as showing when each component started loading.

– TimB

YSlow is good, and HttpWatch for IE is great as well. However, both miss the most important metric to a user: when is the page, above the fold, ready for the user to use? I don't think that one has been solved yet...

– Jilles

There are obviously several ways to measure the response time, but the challenge has always been how to measure the rendering time that is spent in the browser.

We have a controlled test phase in which we use several automated tools for testing the application. One of the outputs we generate from this test is a Fiddler trace for each transaction (a click). We can then analyse the trace to find the time to last byte and subtract it from the overall time the page took.

Something like this:

1. A = total response time, as measured by an automated tool (in our case we use QTPro)
2. B = time to last byte (server + network time, from the Fiddler trace)
3. C = A - B (approximate rendering time, or the time spent in the browser)
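For what it's worth, the same A - B arithmetic can be sketched with the browser's Navigation Timing API instead of a QTPro/Fiddler pairing (my assumption, not part of the setup described above; note that responseEnd covers only the main HTML document, so B is a rougher approximation than a full Fiddler trace):

// run after the load event has finished, so loadEventEnd is populated
var t = window.performance.timing;
var A = t.loadEventEnd - t.navigationStart; // total response time
var B = t.responseEnd - t.navigationStart;  // time to last byte of the HTML
var C = A - B;                              // approx. rendering time in the browser
console.log('total:', A, 'ttlb:', B, 'render:', C);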

All of the above can be made into a standard test process, and at the end of the test we can generate a breakdown of the time spent at each layer, e.g. rendering time, network time, database calls, etc.

– Mouli