
I'm using gtmetrix.com to performance-check my little website. The website is nothing critical; I'm just trying to learn a bit more about performance optimization. The last thing I need to optimize is browser caching. I have read and tried different things, but no matter what, I still get these warnings: Leverage browser caching for the following cacheable resources:

[screenshot: GTmetrix warning listing the cacheable resources]

It's a brand-new MVC site hosted on Azure. In my web.config I have added this:

<caching>
  <outputCacheSettings>
    <outputCacheProfiles>
      <add name="CacheFor60Seconds" duration="60" location="Any"/>
    </outputCacheProfiles>
  </outputCacheSettings>
</caching>

<staticContent>
  <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="3.00:00:00" />
</staticContent>

and my controller actions look like this:

    [OutputCache(CacheProfile = "CacheFor60Seconds")]
    public ActionResult Index()
    {
        return View();
    }

As far as I can see, the resources will expire after 2 hours or 3 days. Why is this not good enough? I get the same result with other performance-checking tools.

Christian

2 Answers


You are caching the result of the MVC Controller action (so the View) for 60 seconds. The warnings are all about static content, which you seem to be addressing with the clientCache element.

The first thing that comes to mind would be to use Azure CDN for caching and faster delivery.

Caching static content by using the clientCache element inside the staticContent element in web.config should normally work, but it is known to not always hold up its end of the bargain. Usually that comes down to a configuration issue. Please read the article I linked to, since it has hints and tips like

To use the httpExpires attribute, you need to set the value of the cacheControlMode attribute to UseExpires

and

The value for the httpExpires attribute must be a fully-formatted date and time that follows the specification in RFC 1123
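To illustrate those two tips: if you wanted a fixed expiry date instead of a sliding max-age, the staticContent section would look roughly like this (the date below is just an example value, not something from your config, and it must be in RFC 1123 format):

<staticContent>
  <!-- cacheControlMode must be UseExpires for httpExpires to take effect -->
  <clientCache cacheControlMode="UseExpires"
               httpExpires="Fri, 01 Jan 2027 00:00:00 GMT" />
</staticContent>

For most sites a sliding max-age (your current UseMaxAge setup) is the easier option, since a fixed date has to be maintained by hand.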

Why is this not good enough?

The question of why this isn't good enough can probably best be answered by the people over at gtmetrix.com, since it's their threshold, and I imagine they have some tips & tricks to improve your results.

You might want to increase the MaxAge for static content and append a (fictitious) version number to your static resources, like some_stylesheet.css?ver=3. That way they are cached for longer periods, and you can force browsers to fetch a fresh copy of a file whenever you want simply by increasing the version number, like some_stylesheet.css?ver=4.
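A minimal sketch of that idea in a Razor view (the file paths and the ver value are just examples; the number is maintained by hand and bumped whenever the file changes):

<!-- bump ?ver= whenever the file content changes -->
<link href="~/Content/some_stylesheet.css?ver=3" rel="stylesheet" />
<script src="~/Scripts/site.js?ver=3"></script>

The query string doesn't affect which file IIS serves; it only changes the URL the browser caches against, so a new number forces a re-download.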

There's an interesting article by Mads Kristensen over here that uses fingerprinting: Cache busting in ASP.NET

rickvdbosch

As Rick pointed out in his answer, your controller code only caches the view for that action, so that code is not really relevant for the .jpg/.js files.

Your web.config settings for static content correctly set all static content to be cached for 3 days.

The reason the analytics.js file only says 2 hours is that you are most likely getting that file from an external source (like Google), so it's subject to their cache settings. You can't change their cache headers.

If you want to try to fix the 2 hours, I refer you to this question: PageSpeed Insights 99/100 because of Google Analytics - How can I cache GA?

On the question of why this is not good enough, the answer, I'm afraid, is that it depends on the content.

You should focus on keeping static content cached as long as possible, since downloading the same media multiple times is not optimal for your connection, especially on mobile devices. Images that rarely change could have a cache time of, say, a year. If you use Google PageSpeed Tools, they recommend at least a week. (GTmetrix uses PageSpeed and YSlow to check performance.)

We recommend a minimum cache time of one week and preferably up to one year for static assets, or assets that change infrequently. If you need precise control over when resources are invalidated we recommend using a URL fingerprinting or versioning technique - see invalidating and updating cached responses link above.

And as Rick shows and PageSpeed recommends, if you have content that changes often, you can use URL fingerprinting and set longer cache times on those resources.
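Combined with fingerprinting or versioning, that could mean bumping the clientCache max-age in your web.config from 3 days to a full year, roughly like this (the value format is d.hh:mm:ss):

<staticContent>
  <!-- 365 days, in line with the PageSpeed recommendation above -->
  <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="365.00:00:00" />
</staticContent>

Just remember that with a year-long max-age you must version or fingerprint the URLs, or returning visitors won't see updated files until the cache expires.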

JohanSellberg