
I'm going to be the inevitable contrarian in the opposite direction: I don't think cross-domain requests were actually saving that much bandwidth. The common use case I can think of is JavaScript CDNs. The problem there is that JS libraries update frequently - even something as common as jQuery has hundreds of releases, all of which get their own separately cached URL. So the chance of two sites using the same jQuery version is low. Keep in mind that public JS CDN URLs are rarely refreshed, either - the version in the URL is more an indicator of when the site was developed than of the latest version it was tested with. So you could hit hundreds of sites and never get a cross-domain cache hit.
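
To put a rough number on that intuition, here's a toy back-of-the-envelope sketch (TypeScript; the count of distinct CDN URLs in circulation is an assumed figure for illustration, not measured data):

  // Toy model: suppose the jQuery URLs referenced in the wild are spread
  // roughly uniformly over V distinct CDN URLs, and your cache already
  // holds one of them. Each newly visited site then matches your cached
  // URL with probability 1/V, so you'd expect to visit about V sites
  // before the first cross-site cache hit.
  const distinctCdnUrls = 100; // assumption for illustration only
  const hitProbability = 1 / distinctCdnUrls;
  const expectedSitesUntilFirstHit = 1 / hitProbability; // mean of a geometric distribution
  console.log(`P(hit) = ${hitProbability}, expected sites until first hit = ${expectedSitesUntilFirstHit}`);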

Even if you did share a URL with another site, the benefit is small compared to what you can already do with same-domain requests. Most sites should be served over HTTP/2 by now, which means even unoptimized sites load decently fast because requests aren't as expensive as they used to be. You can get almost all of the bandwidth benefit of a cross-domain cache just by making sure your own resources are cached for a long time.
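
As a sketch of what that looks like in practice (a minimal Node/TypeScript server; the /assets/ path, file naming, and max-age value are illustrative assumptions, not anyone's actual setup):

  import { createServer } from "node:http";
  import { readFile } from "node:fs/promises";

  // Serve your own fingerprinted assets (e.g. app.3f2a1c.js) with a long-lived,
  // immutable cache policy. Repeat visitors to *your* site then skip the
  // download entirely, which is where most of the real bandwidth saving is.
  const server = createServer(async (req, res) => {
    if (req.url && req.url.startsWith("/assets/") && req.url.endsWith(".js")) {
      try {
        const body = await readFile("." + req.url);
        res.setHeader("Content-Type", "application/javascript");
        // One year + immutable is safe because the file name changes whenever the content does.
        res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
        res.end(body);
        return;
      } catch {
        // fall through to 404
      }
    }
    res.statusCode = 404;
    res.end("not found");
  });

  server.listen(8080);

The same idea applies whether the headers come from a CDN config, nginx, or your framework's static-file middleware; the point is just that long-lived caching of your own URLs recovers most of what the shared cache used to provide.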



Mozilla ran the numbers and it's not a huge penalty.

It's just frustrating that it's one more optimization getting turned off, making the internet a tiny bit worse as a result. It's death by a thousand cuts.



