There are a lot of comments in here about how it's bad that cookies haven't always worked this way, but a significant amount of web content to this day still requires third-party cookies to work. And I'm not talking about cookies designed for analytics purposes; the concerns raised in these discussions revolve around simple things like logins breaking.
For greenhorn web developers, you could say the same thing about TLS certificates. Why weren't they always free?
Well, another reason is that TLS (and SSL before it) wasn't just about encryption, but about a "web of trust." Encryption alone isn't trust.
Many things about web technologies have changed over time; and it's easy to say that any individual piece of functionality should have worked this or that way all along, but the original intent of many web features and how those features are used today can be very different.
One day industry standards may dictate that we don't even process HTTPS requests in a way where the client's IP address is fully exposed to the server. Someone along the way might decide that a trusted agent should serve pages back on behalf of a client, for all clients.
After all, why should a third-party pixel.png request expose the fact that I'm browsing another website?! How absurd, don't you think? And yet, we do it every day.
> Well, another reason is that TLS (and SSL before it) wasn't just about encryption, but about a "web of trust." Encryption alone isn't trust.
Which is a nice principle, but given corporate and government incentives, the trust provided was lackluster at best. The PKI is pretty much broken because of it.
In the end, all it did was impose an unaffordable cost on hobbyist bloggers and other netizens.
You used to be able to simply install a Firefox extension[1] or Android app[2] and automatically steal the accounts of everyone on your wifi network on every website. https stopped that.
Widespread HTTPS did that. Firesheep motivated the big players to stop cheaping out and go fully HTTPS, unlike earlier approaches that used HTTPS for login pages only. But it also took Let's Encrypt for HTTPS to become truly widespread.
Yeah, in the end it’s silly that we ended up with “trust” meaning only “you’re connected to someone that controls the domain,” which doesn’t actually need a PKI to accomplish if we just supported an SRV record with the public key(s) and verifiably authoritative DNS queries.
Which, fair, is trading one PKI for another, but web servers vastly outnumber authoritative DNS servers. And DKIM gets along fine without it, so we probably could too.
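The DNS-pinning idea above can be sketched roughly like this (all names and values here are invented for illustration; DANE's TLSA records are the existing real-world mechanism for publishing key material in DNS). The client hashes the public key the server presents during the handshake and compares it against a digest the domain owner published in a verifiably authoritative DNS record:

```python
import hashlib

def spki_digest(public_key_der: bytes) -> str:
    # Hex SHA-256 digest of the server's DER-encoded public key.
    return hashlib.sha256(public_key_der).hexdigest()

def key_matches_dns(public_key_der: bytes, dns_published_digest: str) -> bool:
    # Accept the connection only if the key the server presented matches
    # the digest the domain owner published in a signed DNS answer.
    return spki_digest(public_key_der) == dns_published_digest.lower()

# Toy stand-ins: in reality the key comes from the TLS handshake and the
# digest from a DNSSEC-validated record.
server_key = b"not-a-real-DER-key"
published = spki_digest(server_key)
print(key_matches_dns(server_key, published))   # True
print(key_matches_dns(b"mitm-key", published))  # False
```

Note that this shifts the root of trust entirely onto DNS integrity, which is why the "verifiably authoritative" part of the proposal is doing all the heavy lifting.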
Well, there is nothing that makes it impossible to build logins and the like without third-party cookies. Yes, there are certain patterns out there that use them, but slowly turning third-party cookies off and giving major sites time to adapt might help to dump them completely one day.
I think the whole idea of sharing cookies across origins was a conceptual mistake right from the beginning, because it is also responsible for quite a lot of attack vectors which had to be fixed by other mechanisms like the SOP (Same Origin Policy), which in turn required mechanisms like CORS (Cross-Origin Resource Sharing).
And with all those mechanisms in place, modern browsers are pretty tied up and significantly reduced in their abilities compared to other HTTP(S) clients. So when you want to build a PWA (Progressive Web App) that can use a configurable backend (as in federated), you will run into all kinds of problems that can all be traced back to the decision to share cookies across origins.
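To make the SOP/CORS layering concrete, here is a minimal server-side sketch (the allow-list and function name are hypothetical, not any framework's real API). Without these response headers, the browser's Same Origin Policy prevents a page on another origin from reading the response at all, which is exactly the wall a federated PWA runs into:

```python
# Hypothetical allow-list; a PWA with a configurable backend would need
# every frontend origin listed here (or reflected dynamically).
ALLOWED_ORIGINS = {"https://app.example", "https://admin.example"}

def cors_headers(request_origin: str) -> dict:
    if request_origin not in ALLOWED_ORIGINS:
        return {}  # browser blocks the calling page from reading the body
    return {
        "Access-Control-Allow-Origin": request_origin,
        "Access-Control-Allow-Credentials": "true",  # required for cookies
        "Vary": "Origin",  # responses differ per origin, so caches must too
    }

print(cors_headers("https://app.example")["Access-Control-Allow-Origin"])
print(cors_headers("https://evil.example"))  # {}
```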
My point is you can make these arguments all day. Why do we allow iframes? You could argue web servers should simply communicate among themselves and serve subdocuments to clients.
Why is HTTP/2 Server Push being deprecated?
Why do user agents not provide additional types for <script> based on runtime installations?
Why isn't there a renderer API that allows me to use painted trees in <canvas>, but there is a bluetooth API that no one uses?
I am not sure if I get your point then. I think it is important to see the patterns of bad decisions in the past to improve decision making in the future.
That those mistakes were not made deliberately, and were made with good intentions, is a completely different story; and that in hindsight everything looks so clear is also well known ;-)
> a significant amount of web content to this day still requires third-party cookies to work.
Not in the corners of the web I frequent. I've been blocking third-party cookies for years, and the only site that broke was some Pearson online homework site.
A lot of IdPs break. For example, any website that presents "Login with Google" will not work, or will require a reload after completing the auth flow before the login is accepted.
This isn't simply "blocking third-party cookies"; it's "even an iframe has no access to the other state partition." The third-party cookie is allowed to exist, but it cannot leak to other sites. However, this leak prevention breaks plenty of other things if one is not careful (Mozilla was; there is a heuristic).