
I am currently trying to performance-tune the UI of a company web application. The application will only ever be accessed by staff, so the connection between the server and the clients will always be considerably faster than it would be over the internet.

I have been using performance auditing tools such as YSlow and Google Chrome's profiling tool to try and highlight areas that are worth targeting for investigation. However, these tools are written with the internet in mind. For example, the current suggestions from a Google Chrome audit of the application are as follows:

Network Utilization

  • Combine external CSS (Red warning)
  • Combine external JavaScript (Red warning)
  • Enable gzip compression (Red warning)
  • Leverage browser caching (Red warning)
  • Leverage proxy caching (Amber warning)
  • Minimise cookie size (Amber warning)
  • Parallelize downloads across hostnames (Amber warning)
  • Serve static content from a cookieless domain (Amber warning)

Web Page Performance

  • Remove unused CSS rules (Amber warning)
  • Use normal CSS property names instead of vendor-prefixed ones (Amber warning)

Are any of these bits of advice totally redundant given the connection speed and usage pattern? The users will be using the application frequently throughout the day, so it doesn't matter if the initial hit is large (when they first visit the page and build their cache) so long as a minimal amount of work is done on future page views.

For example, is it worth the effort of combining all of our CSS and JavaScript files? It may speed up the initial page view, but how much of a difference will it really make on subsequent page views throughout the working day?

I've tried searching for this but all I keep coming up with is the standard internet facing performance advice. Any advice on what to focus my performance tweaking efforts on in this scenario, or other auditing tool recommendations, would be much appreciated.

Nate
  • @bwheeler96: be nice. This site isn't just for hobbyists – Sheena Nov 05 '12 at 16:54
  • You need to specify: is this a load-time problem or a performance issue with the app AFTER it is already loaded? – Diodeus - James MacFarlane Nov 05 '12 at 17:06
  • Diodeus, it's a load-time problem. Once the application has loaded the UI performs fine. However, it can take many seconds to load. Some pages currently require 50+ GET requests to display, and even on the fast connection to the server (or even when running a server locally) it takes some time to load all of the resources. I'm looking for some pointers as to what is worth prioritising in the investigation. – user1018494 Nov 05 '12 at 17:12
  • @bwheeler96, whilst I understand your sentiment, sometimes business requirements mean people end up working outside of their domain knowledge areas. That's what I'm doing right now. What's wrong with asking on Stack Overflow for advice, as opposed to wasting company time tweaking areas that will give negligible performance increases? – user1018494 Nov 05 '12 at 17:16

3 Answers


One size does not fit all with these things; the item that immediately jumps out as something that will have a big impact is "leverage browser caching". This reduces bandwidth use, obviously, but also means the browser doesn't need to re-fetch and re-process whatever you've cached. Even if you have plenty of bandwidth, each file you download requires resources from the browser - a thread to manage the download, parsing of the file, memory management, and so on. Reducing that will make the app feel faster.
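To make this concrete, here is a minimal sketch of what those caching headers look like, assuming purely for illustration a Node/Express server (the question doesn't say what the actual stack is):

    // Hypothetical Express setup - illustrates the caching headers only,
    // not the application's actual configuration.
    var express = require('express');
    var app = express();

    // Serve static assets with a long max-age so the browser reuses its
    // cached copy on every page view during the working day; the ETag
    // lets an expired cache be revalidated with a cheap 304 response.
    app.use('/static', express.static('public', {
      maxAge: '7d', // sends Cache-Control: public, max-age=604800
      etag: true
    }));

    app.listen(3000);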

GZIP compression is possibly redundant, and potentially even harmful if you really do have unlimited bandwidth - it consumes resources on the server to compress the data, and on the client to decompress it. Not much, and I've never been able to measure it - but in theory it might make a difference.
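If you'd rather measure than guess, one low-risk compromise (again assuming Express, using the widely-used compression middleware) is to compress only responses above a size threshold, so small assets skip the compress/decompress round trip entirely:

    var express = require('express');
    var compression = require('compression');
    var app = express();

    // Only gzip responses larger than 8 KB (a threshold to tune by
    // measurement, not a recommendation); smaller responses are sent
    // uncompressed.
    app.use(compression({ threshold: 8 * 1024 }));

    app.listen(3000);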

Proxy caching may also help - depending on your company's network infrastructure.

Reducing cookie size may help - not just because of the bandwidth issue, but because managing cookies consumes resources on the client; this also explains why serving static assets from a cookieless domain helps.

However, if you're going to optimize the performance of the UI, you really need to understand where the slow-down is. YSlow and Chrome focus on common problems, many of them related to bandwidth and the behaviour of the browser. They can't tell you whether one particular part of the JS is slow, or whether the server is struggling with a particular dynamic page request.

Tools like Firebug help with that - look at what's happening with the network, and whether any assets take longer than you expect. Use the JavaScript profiler to see where you're spending the most time.
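You can also bracket suspect code yourself with the browser's User Timing API (performance.mark / performance.measure) and compare the numbers against what the profiler reports; renderDashboard below is a hypothetical stand-in for whatever code you suspect is slow:

    performance.mark('render-start');

    renderDashboard(); // hypothetical function under investigation

    performance.mark('render-end');
    performance.measure('render', 'render-start', 'render-end');

    // Read the measurement back and log its duration.
    var entry = performance.getEntriesByName('render')[0];
    console.log('renderDashboard took ' + entry.duration.toFixed(1) + ' ms');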

Neville Kuyt
  • Thanks, that's really helpful. Would GZIP compression be beneficial in this situation if it reduces the number of parallel downloads? Or does the number of parallel downloads only really matter when bandwidth is limited? – user1018494 Nov 05 '12 at 17:19
  • Parallel downloads usually help performance - if you structure them cleverly. They allow the browser to download multiple files at the same time, and start to parse them as they are downloaded. If you're not bandwidth constrained, this means that the browser can do more at the same time. GZIP compression isn't really related to this - it reduces the file size for individual assets, and thus allows them to be downloaded faster, at the expense of some CPU and memory to manage the decompression on the client... – Neville Kuyt Nov 05 '12 at 17:28
  • GZIP compression would appear to be totally redundant for an intranet application then. The initial hit of downloading a slightly larger file, which is then cached for later use, is going to be a lot less painful than compressing it on the server and decompressing it on the client. For anyone else stumbling across this SO question, I've also just found this http://stackoverflow.com/questions/2707499/improving-javascript-load-times-concatenation-vs-many-cache which may prove useful. – user1018494 Nov 05 '12 at 17:46

Most of these tools provide steps or advice based on a one-time check. That catches a few issues, but it does not tell you how your users actually experience your site. Real user monitoring is the right way to measure live user performance. You can use the Navigation Timing API to measure page load times and resource timings.
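As a rough sketch of what reading those timings looks like in the browser (this waits until after the load event so the final timestamps are populated):

    window.addEventListener('load', function () {
      // Defer one tick so loadEventEnd has been recorded.
      setTimeout(function () {
        // Navigation Timing: one entry describing the page load itself.
        var nav = performance.getEntriesByType('navigation')[0];
        console.log('Total page load: ' + nav.duration.toFixed(0) + ' ms');
        console.log('Time to first byte: ' +
          (nav.responseStart - nav.requestStart).toFixed(0) + ' ms');

        // Resource Timing: one entry per CSS/JS/image request, useful
        // for spotting which individual GETs are slow.
        performance.getEntriesByType('resource').forEach(function (res) {
          console.log(res.name + ': ' + res.duration.toFixed(0) + ' ms');
        });
      }, 0);
    });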

If you are looking for a service, you can try https://www.atatus.com/, which provides real user monitoring, Ajax monitoring, transaction monitoring and JavaScript error tracking.

Fizer Khan

Here is a list of additional services you can use to test website speed: http://sixrevisions.com/tools/free-website-speed-testing/

andreimpop
  • Those look pretty INTERnet application orientated (as opposed to INTRAnet application orientated) – Sheena Nov 05 '12 at 17:06
  • Thanks for that. Unfortunately a lot of these tools require a publicly accessible URL to run against. Because this is an internal application it's just not going to be possible to set up a public-facing server just for the sake of running auditing tools against it. Do you have any recommendations for offline tools, other than the ones I mentioned in my original post? – user1018494 Nov 05 '12 at 17:08