HTTP errors – About 10% of the URLs I requested were unresolvable, unreachable, or otherwise refused my connection. A big part of that is due to Alexa basing its rankings on domains, not specific hosts. Even if a site only responds to www.domain.com, Alexa lists it as domain.com and my request to domain.com went unanswered.
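If you wanted to recover most of those misses, a simple www. retry would do it. Here's a rough sketch of that fallback using Python's requests library (illustrative only; this isn't the script I actually ran):

```python
import requests

def fetch_with_www_fallback(domain, timeout=10):
    """Try the bare domain first; if that fails, retry with a www. prefix.

    `domain` is a bare hostname as Alexa lists it, e.g. "example.com".
    Illustrative sketch only, not the crawler used for the numbers here.
    """
    for host in (domain, "www." + domain):
        try:
            resp = requests.get("http://" + host, timeout=timeout)
            resp.raise_for_status()
            return resp
        except requests.RequestException:
            continue  # covers DNS failures, refused connections, and HTTP errors
    return None  # both variants were unreachable
```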
At first, that may seem like an awful lot of potential error. However, the one thing all of these inaccuracies have in common is that none of them favor the case for using a public CDN.
I would have to disagree with the last paragraph there. I think that if someone is so careless that their page isn't available without the "www.", there's a very strong chance they've never heard of a CDN. So domains that don't work without the "www." are, in my opinion, favouring the non-CDN side.
I've started to see some best-practice-if-you-ignore-user-expectation guides out there which say that responding on the bare domain, without the www, is Not A Good Idea. I don't really know why that would be the case, though.
To be clear, I did not adjust any of my numbers to include an extrapolated extra 10%. Any numbers you see in my post are based on direct observation of a script tag's src attribute.
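For the curious, "direct observation" just means parsing each page and checking every script tag's src against a set of public CDN hostnames. A minimal sketch of that check using BeautifulSoup (the hostname list here is illustrative, not my exact matching rules):

```python
from urllib.parse import urlparse
from bs4 import BeautifulSoup

# Illustrative set of public CDN hosts; not the exact list used for the study.
PUBLIC_CDN_HOSTS = {
    "ajax.googleapis.com",
    "ajax.aspnetcdn.com",
    "code.jquery.com",
}

def uses_public_cdn(html):
    """Return True if any <script src> on the page points at a known public CDN."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all("script", src=True):
        # urlparse handles absolute and protocol-relative URLs; relative
        # paths have no hostname and fall through to the empty string.
        host = urlparse(tag["src"]).hostname or ""
        if host.lower() in PUBLIC_CDN_HOSTS:
            return True
    return False
```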