- Browsers limit the number of concurrent connections to a single server. RFC 2616, which specifies HTTP/1.1, says: "A single-user client SHOULD NOT maintain more than 2 connections with any server or proxy." In practice, SeaMonkey 2.0.3 and Firefox 3.6.2 seem to open up to 15 connections to one host, and Opera 10.54 opens up to 16 by default. With the arrival of Internet Explorer 8, Microsoft raised that number from 2 connections in IE 7 (for HTTP 1.1) to 6 connections in IE 8 (HTTP 1.1). Loading a library from a CDN host therefore frees up connections to your own server.
- Since your servers no longer deliver the library, they are under less load and you consume less bandwidth.
But there are some drawbacks, too: you become dependent on the CDN infrastructure. If it is down, your website will be barely usable, too. And you cannot influence the bandwidth or carriers of the delivery provider's network, although the whole point of a CDN is good connectivity and only a few hops to clients.
Regardless of these drawbacks (which are minor in my opinion), I wondered about the effectiveness of the caching bonus. Many people out there don't want a CDN delivering arbitrary code to their clients and therefore still host the libraries themselves. Another problem is that such libraries exist in many versions: if example.com links `lib-1.1.js` and you link `lib-1.2.js`, the shared caching will fail.
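The version problem comes down to how browser HTTP caches work: entries are keyed by the full request URL, so two different version URLs can never share a cache entry. A minimal sketch of that lookup (the URLs are purely illustrative):

```python
# A browser's HTTP cache is keyed by the full request URL, so two
# different library version URLs never share one cache entry.
# (The URLs below are hypothetical, for illustration only.)
def is_cache_hit(cache, url):
    """Simulate a URL-keyed cache lookup as a browser performs it."""
    return url in cache

cache = {"https://example-cdn.com/libs/lib-1.1.js"}

print(is_cache_hit(cache, "https://example-cdn.com/libs/lib-1.1.js"))  # True: same URL
print(is_cache_hit(cache, "https://example-cdn.com/libs/lib-1.2.js"))  # False: version differs
```

So the caching bonus only materializes when sites converge on the exact same URL, version included.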
## Facts about public CDN usage
To shed some light on this, I checked the top 500 websites globally as well as the top 220 websites in Germany, according to Alexa. I wrote a script that fetches the domains listed at Alexa and another one that parses those websites, because I wanted to know which of the top sites use a hosted version of jQuery. Here are the results:
Of the German sites, only six use a CDN, and all of them use Google. Among the top 500 global sites, 13 use Google and two use Edgecast. Microsoft's CDN is not used at all.
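The check my parsing script performs can be sketched roughly like this. The host patterns are assumptions for illustration, not necessarily the exact ones I used:

```python
import re

# Illustrative (not exhaustive) host patterns for public CDNs serving jQuery.
CDN_PATTERNS = {
    "Google": re.compile(r"ajax\.googleapis\.com/ajax/libs/jquery", re.I),
    "Microsoft": re.compile(r"ajax\.(microsoft|aspnetcdn)\.com", re.I),
    "Edgecast": re.compile(r"edgecastcdn\.net", re.I),
}

def detect_public_cdns(html):
    """Return the set of public CDN names referenced by <script src> URLs in html."""
    srcs = re.findall(r"<script[^>]+src=[\"']([^\"']+)[\"']", html, re.I)
    found = set()
    for src in srcs:
        for name, pattern in CDN_PATTERNS.items():
            if pattern.search(src):
                found.add(name)
    return found

sample = '<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>'
print(detect_public_cdns(sample))  # {'Google'}
```

Run that over each fetched homepage and tally the results per CDN.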
I was surprised that so few sites make use of the free CDN services. Maybe the top websites tune their homepages for faster load times and therefore leave the libraries out there? Maybe I should have crawled more than just the homepage? Or is jQuery simply not used at all?
The sites I got from Alexa are a nice starting point, but to get more precise numbers we should probably crawl more websites. In Germany, an association called IVW measures traffic for many German websites. I don't know whether there is something similar for your country, or even on a global scale. If someone has a good list: just drop me a line.
It seems that even though Edgecast might load a bit faster, Google is far more common among the sites I crawled. Perhaps you should pay special attention to your audience: if you run a regional website, look at popular, relevant sites that might already be in your visitors' browsing history.
Unless you run a high-security (Ajax) banking website or have other reasons to avoid public CDNs, Google's seems to be quite a good choice.