I decided it's outside the scope of that post (which was already laboriously long-winded), but you can/should use a fallback technique like this to mitigate the potential for Google downtime: http://weblogs.asp.net/jgalloway/archive/2010/01/21/using-cd...
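For reference, the linked technique amounts to something like this (a sketch; the jQuery version and local path are illustrative):

    <script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
    <script>
      // If the CDN request failed, window.jQuery was never defined;
      // write a local copy into the page in its place.
      if (typeof jQuery === 'undefined') {
        document.write('<script src="/scripts/jquery-1.4.2.min.js"><\/script>');
      }
    </script>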
This technique doesn't address the right problem, and the reasoning behind it may rest on a misunderstanding of statistics.
That is, no matter how reliable the other source (Google etc.) is, its uptime is still below 100%. If you host the JS yourself, your site becomes slow only if your own server hangs. If you host the JS somewhere else, your site becomes slow whenever your server or the other server hangs. The probability of the latter is always greater than that of the former, i.e. you don't really gain anything.
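To put illustrative numbers on it (assuming independent failures): if your own server is up 99.9% of the time and the CDN is up 99.99%, a page that needs both works only

    0.999 * 0.9999 = 0.9989, i.e. ~99.89%

of the time, which is strictly worse than the 99.9% you would get hosting the file yourself. (A fallback changes this arithmetic, but only to the extent that it can detect the failure quickly.)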
So this trick slightly improves the good cases but increases the likelihood of the bad cases. That kind of trade-off isn't desirable. Usually, people design trade-offs for the exact opposite: sacrificing speed in the normal case (which should be more than fast enough anyway) in order to decrease the probability of the worst case.
(BTW, this is true for almost all long-lived projects. The opposite strategy only makes sense in "car racing"-like situations where you either win fast or lose everything. However, hardly any website is designed to live only a few weeks or so.)
Hosting jQuery on your own website increases the load on your server, thus increasing the probability that your site will fail.
The more sites that use the CDN, the more likely it is that someone coming to your site already has jQuery in their cache.
If you're just trying to minimize downtime regardless of cost, then I absolutely agree with your analysis. However, if you're trying to minimize something more complex involving both downtime and cost, then maybe there's a point where it makes sense to use the CDN. Something very high volume like Twitter, for example.
I've thought to myself, "I wish there were a timeout attribute on the <script> tag," about once every few months for the past 10 years. Is there any good reason you can't manually specify how long you want the browser to wait for an external file to load?
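There isn't one, but you can approximate it by injecting the script dynamically and racing it against a timer. A rough sketch (the function name, URLs, and the two-second budget are all made up for illustration):

    function loadScriptWithTimeout(src, timeoutMs, onFail) {
      var head = document.getElementsByTagName('head')[0];
      var script = document.createElement('script');
      var timer = setTimeout(function () {
        script.onload = null;       // ignore a late arrival
        head.removeChild(script);   // note: removal doesn't cancel the request in every browser
        onFail();
      }, timeoutMs);
      script.onload = function () { clearTimeout(timer); };
      // Older IE fires onreadystatechange instead of onload; omitted for brevity.
      script.src = src;
      head.appendChild(script);
    }

    // Give the CDN two seconds, then fall back to a local copy:
    loadScriptWithTimeout(
      'http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js',
      2000,
      function () {
        var s = document.createElement('script');
        s.src = '/scripts/jquery-1.4.2.min.js';
        document.getElementsByTagName('head')[0].appendChild(s);
      }
    );

The trade-off is that dynamically injected scripts don't block the parser, so any inline code that assumes jQuery is already present has to wait for the fallback's own onload.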
That fallback technique mitigates the majority of failure cases, though (NoScript, overbearing firewalls, blocked regions, etc.). The CDN itself being slow or down is vanishingly rare.
"The CDN itself being slowly-down is vanishingly rare."
Based on...?
It happened to me once, a few months ago. It was down for hours. The negative impact was very real and painful, and in my opinion it outweighed the advantages of using Google's CDN.
Pingdom tested Google's, Microsoft's, and Edgecast's jQuery CDNs every minute for a couple of weeks and found that all of them averaged between 100 and 150 ms to download jQuery [1]. Google's was actually the slowest of those, averaging a turtle's pace of ~130 ms from all of Pingdom's datacenters. They're all so close that the Google CDN's overwhelming caching advantage should still make it preferable, though.
More anecdotally, I've been running a few Pingdom-type tests myself for a longer period, using uptime tools on a few of my servers and mon.itor.us. Except for the brief outage on the morning of May 14, 2009 [2], I haven't observed a net-wide outage or even a 250+ ms slowdown.
I'd be genuinely interested in any concrete data to the contrary.
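If anyone wants to collect their own numbers, browsers that support the Resource Timing API expose per-resource download times; pasting something like this into the console reports how long the CDN copy of jQuery took on the current page (a sketch, and browser support varies):

    // List every resource loaded from the Google CDN with its duration.
    var entries = performance.getEntriesByType('resource');
    for (var i = 0; i < entries.length; i++) {
      if (entries[i].name.indexOf('ajax.googleapis.com') !== -1) {
        console.log(entries[i].name + ': ' + Math.round(entries[i].duration) + ' ms');
      }
    }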
I don't know; anecdotal evidence here, but in the majority of cases where I see pages taking a long time to load due to third-party JS etc., they're waiting for actual data rather than anything else.
The OP said:
"and I noticed the render time of my site (and a few others) jump up anywhere between 5x and 100x."
Which would indeed suggest that it did load, but at significantly reduced speeds.
These highly available, distributed CDNs hosting static content don't have the same failure characteristics as something like an advertising script or a Twitter widget. While the latter do often hang the page (frustratingly), the popular CDNs that host jQuery aren't prone to that under any but the rarest of circumstances (see the async-loading sketch below for the usual mitigation).
Maybe unrelated, but you'd be surprised how often the "Waiting for domain.com..." in your browser's status bar is misleading. Interactions between externally referenced scripts, images, and scripts that use document.write can produce "interesting" results in most browsers.
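For what it's worth, the usual defense against a widget or ad script that can hang the page is to inject it asynchronously rather than reference it with a blocking <script> tag. A minimal sketch (the URL is hypothetical):

    // Injected scripts don't block parsing or rendering while they download.
    var s = document.createElement('script');
    s.src = 'http://widgets.example.com/widget.js';
    s.async = true;
    document.getElementsByTagName('head')[0].appendChild(s);

The catch is that scripts relying on document.write break when loaded this way, which is part of why those interactions get "interesting".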
Also, HTML5 Boilerplate has that fallback built in: http://html5boilerplate.com/
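Its fallback is the same window.jQuery check as above, written as a one-liner with a protocol-relative URL so the CDN reference works on both http and https pages (version and local path vary by release; these are illustrative):

    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
    <script>window.jQuery || document.write('<script src="js/jquery-1.4.2.min.js"><\/script>')</script>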