28

On a typical webpage I load the following, all from CDNs:

  • jQuery
  • Angular
  • Bootstrap
  • Icomoon
  • a few Angular plugins

Is it better to load these from 1 CDN (if possible), or from different CDNs? Are there best practices for this, or does it not make a difference?

Arya McCarthy
Jimmery

9 Answers

16

Whether one or multiple CDNs is better depends on how many components you download from the same hostname. According to this article from the Yahoo UI team, HTTP/1.1 suggests that browsers limit parallel downloads to two per hostname. Therefore, spreading resources across multiple CDNs, that is, different hostnames, can be good practice.

For closely related components you might want to use the same CDN just to avoid accidental version mismatches (Angular and angular-router, for example), but as the number of downloads per hostname grows, you hit the same bottleneck (at least in browsers that follow the spec's suggestion).

Using a CDN is definitely good practice for improving your website's loading performance. However, you should prefer the most popular CDNs you can find: that increases the chance a visitor already has a cached copy of a file, downloaded from a different site that uses the same CDN, which speeds up loading even further.

As @JeffPuckett pointed out in the comments, browsers nowadays have a higher limit of simultaneous downloads per server/proxy:

Firefox 2:  2
Firefox 3+: 6
Opera 9.26: 4
Opera 12:   6
Safari 3:   4
Safari 5:   6
IE 7:       2
IE 8:       6
IE 10:      8
Chrome:     6

Ref.: https://stackoverflow.com/a/985704/4488121
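As a rough illustration, here is a minimal sketch of splitting the libraries from the question across two hostnames so an HTTP/1.1 browser can download them in parallel (the CDN URLs and versions below are only examples; use whichever CDNs actually host the versions you need):

<!-- Hostname 1: Google's Hosted Libraries for jQuery and Angular -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js"></script>
<script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.6.9/angular.min.js"></script>
<!-- Hostname 2: a different CDN for Bootstrap, so these downloads don't queue behind the ones above -->
<link rel="stylesheet" href="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/css/bootstrap.min.css">
<script src="https://maxcdn.bootstrapcdn.com/bootstrap/3.3.7/js/bootstrap.min.js"></script>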

lenilsondc
  • Also, the limit of parallel downloads per host varies by browser beyond the HTTP specification. https://stackoverflow.com/q/985431/4233593 – Jeff Puckett Sep 05 '17 at 11:24
  • 1
    Exactly, the spec **suggests** that browsers should limit parallel downloads to two per hostname (that is only Firefox 2 and IE 7 according to your link); that's why it's a "should" and not a "shall". That suggestion came from HTTP/1.1 and no longer exists in the spec, but old HTTP/1.1-compliant browsers might have this limitation. Nowadays browsers have figured out that allowing more parallel downloads significantly improves loading time, so honestly you'd have to load a huge number of files from the same CDN for this to become an issue, like up to 8 files for example. – lenilsondc Sep 05 '17 at 12:34
5

I'm afraid there is no silver-bullet answer to this question, as usually happens. Here are my 2 cents. Instead of focusing on the number of simultaneous connections and on browsers/standards, I want to look at it from a different perspective.

What matters most for both your users and your server is page load time and service availability. The fastest load is a load from cache, and the more of your users use a specific file from a particular CDN, the better the chances of a cache hit.

Based on this goal, it makes sense to

  • Load popular libraries from popular CDN(s), which depending on the library list can be the same CDN or different CDNs. I would argue that the number of parallel HTTP connections a browser allows is a secondary concern.
  • Join and minify custom scripts and rarely used 3rd-party libraries into as few files as makes sense (for example, a single CSS and a single JS file) and load them from your own host or your own CDN (if you have tons of users coming from different locations or even continents, a CDN probably isn't a luxury for you).
  • If CDN-based scripts fail to load from the CDN for whatever reason, fall back to a local copy (see the sketch after this list).
  • Should you have the option, select the most-used version of each library, which is most likely not the latest one.
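For the fallback bullet, a minimal sketch (the local path is an assumption about your project layout; the test works because window.jQuery stays undefined whenever the CDN copy failed to load):

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js"></script>
<script>
  // If the CDN script failed to load, fall back to a local copy.
  window.jQuery || document.write('<script src="/js/jquery-1.12.4.min.js"><\/script>');
</script>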

I would categorize libraries for which you can find CDNs and usage statistics as popular, and the others as not so much, although you can decide for yourself what to host locally based on other recommendations.

For statistics, you may want to use something like w3techs.

Picking between "few high-traffic" and "many low-traffic" sites can be done based on an educated guess about your website's audience, but if you want to be sure, you can try to measure the cache hit ratio. It's not straightforward, but here is one idea.
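One hedged sketch of that idea uses the Resource Timing API (caveat: for cross-origin files, transferSize and decodedBodySize are only exposed when the CDN sends a Timing-Allow-Origin header, so without that header this heuristic tells you nothing):

<script>
  window.addEventListener('load', function () {
    performance.getEntriesByType('resource').forEach(function (entry) {
      // transferSize === 0 with a non-zero body size suggests the file came from cache.
      if (entry.transferSize === 0 && entry.decodedBodySize > 0) {
        console.log('likely cache hit:', entry.name);
      }
    });
  });
</script>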

Now, for the versions, should you have the option to choose one: if you decide to go with the "few high-traffic sites" option, it's definitely worth checking which version of each library they load from the CDN. Otherwise, for the "many low-traffic" option, the most popular version is preferable. The same w3techs site should have those statistics.

It might sound like a lot of trouble, but it's done rarely (if not just once), since these statistics tend to change quite slowly.

isp-zax
5

The accepted answer is outdated: the referenced Yahoo document was published in 2007 and was specifically about HTTP/1.1. See the correct information for HTTP/2: Does the per-host connection limit is raised with HTTP/2?

I would say exactly the opposite: it is best to load as many resources as possible from the same host, in this case a CDN, provided it supports HTTP/2. Browser support for HTTP/2 is already very high and will get close to 100% in a few more years (if combined with HTTP/3).

Also, there are costs associated with using more CDNs to load files:

  1. Multiple DNS lookups.
  2. Multiple connections per page load.
  3. Not optimal for mobile devices.

Now, answering your specific questions:

Is it better to load these from 1 CDN or different CDNs? Since most CDNs these days support HTTP/2 by default, it is recommended to serve as much content as possible from a single CDN. By the way, there is a CDN, PageCDN, that was built just to solve this incidental domain sharding issue. It provides JavaScript libraries, fonts, and a private CDN on a single host to get the most out of a single HTTP/2 connection.

Are there best practices for this? The best practices are:

  1. Reduce DNS lookups.
  2. Reduce connections: prefer fewer CDNs/hosts over more.
  3. Do not bundle files. Many developers will recommend bundling, but the Google Chrome V8 team does not recommend it.
  4. Use a preconnect resource hint to save connection time, as sketched below.
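A minimal preconnect sketch (the hostname is a placeholder for whatever CDN you actually use):

<!-- Open the DNS/TCP/TLS connection early, before any resource on this host is requested -->
<link rel="preconnect" href="https://cdn.example.com" crossorigin>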

Does it not make a difference? Using a CDN does make a difference: it can reuse another website's entries in the visitor's browser cache for your purposes, so even your first-time visitors get very fast page loads.

Ibtsam Ch
2

You should avoid using multiple CDN links directly in a single page, as it can hurt performance. In ASP.NET you can try something like the code below to reference a CDN through a bundle:

bundles.UseCdn = true; // let bundles be served from a CDN when optimizations are enabled

// Serve jQuery from the Microsoft Ajax CDN; the Include() path is the local
// copy that is used instead when the CDN is not in use (e.g. in debug builds).
bundles.Add(new ScriptBundle("~/bundles/jquery",
    "//ajax.aspnetcdn.com/ajax/jQuery/jquery-1.10.2.js")
    .Include("~/Scripts/jquery-{version}.js"));

Bundling combines multiple files into a single file, which reduces the number of requests to the server required to retrieve and display a web page.

lazydeveloper
2

Using a CDN is the way to go, as others have already answered, but adding more CDNs only scales up to the point where the DNS lookup time for each extra CDN starts to dominate the total download time.

So maxing out each CDN by loading at least 2-6 resources from each would probably be the best trade-off.

light_303
0

A good question to start asking, now that the majority of browsers support HTTP/2, is: does my CDN support HTTP/2? If so, it's probable that multiplexing those requests over one connection to a single CDN is faster. It could also save DNS lookups if the other CDNs' hostnames aren't already resolved and cached. If the CDN supports server push, it could increase speed further, since a lot of those libraries require multiple files.
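To verify which protocol a CDN response actually used, here is a small sketch with the Resource Timing API (nextHopProtocol reports "h2" for HTTP/2):

<script>
  window.addEventListener('load', function () {
    performance.getEntriesByType('resource').forEach(function (entry) {
      // Logs e.g. "h2" or "http/1.1" for each downloaded resource.
      console.log(entry.nextHopProtocol, entry.name);
    });
  });
</script>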

pucky124
0

Best practice is to use both CDNs and fallbacks (local files). To answer your question: use multiple CDN paths, so that if one CDN is down, the rest are still up and working; for the CDN that is down, the fallback will kick in.

In short, use both; that's the best practice I follow in my projects.

Krishna9960
-2

You could easily use a CDN for fetching data.

Prashant
-3

You should try to avoid using CDNs because:

  1. If the CDN goes down for some reason, it will also affect your app, and you may face downtime, which should never happen in production.
  2. With a CDN, the browser needs to connect to different domains and then download the necessary files, which takes more time.
  3. The number of requests to different domains will also slow your app's load time.

Browsers have a limit of around 6 concurrent requests per host, so if you make many requests at a time, the rest will sit in a queue and take more time; try to minimize the number of requests.

Recommendation:

Use Bower/Grunt or webpack in your application:

  1. Bower brings all of your application's dependencies onto your local server/machine, so the browser need not reach out to different domains to get the required files.
  2. Grunt can concatenate and minify all the different files into one file, so the total size decreases and the browser makes fewer requests to download files (see the Gruntfile sketch below).
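As a rough illustration of the Grunt point, a minimal Gruntfile sketch (it assumes the grunt-contrib-concat and grunt-contrib-uglify plugins are installed, and the source paths are placeholders for your Bower-managed dependencies):

// Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    // Concatenate vendor libraries into a single file...
    concat: {
      vendor: {
        src: ['bower_components/jquery/dist/jquery.js',
              'bower_components/angular/angular.js'],
        dest: 'dist/vendor.js'
      }
    },
    // ...then minify the result to cut its size.
    uglify: {
      vendor: { files: { 'dist/vendor.min.js': ['dist/vendor.js'] } }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('default', ['concat', 'uglify']);
};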

If you have any doubt about Bower/Grunt, I can help you with that.

  • 2
    This is bad advice - if a CDN is down, any competent programmer should have fallbacks in place to load local versions of dependencies. CDN files will also likely already be in a user's browser cache, decreasing load times slightly. – Adam Sep 07 '17 at 19:04
  • Instead of managing fallbacks, we can use local versions of dependencies directly, where we can minify and concatenate all the dependencies and load a single file. I think this will be faster if you have a lot of dependencies. – Shubham Tripathi Sep 08 '17 at 06:26
  • 2
    "Browser has a limit of 6 requests at a time" - per server - have a look at this question: https://stackoverflow.com/questions/985431/max-parallel-http-connections-in-a-browser – Jimmery Sep 08 '17 at 13:29