> Many will say, this is good, children should be protected. The second part of that is true. But the way this is being done won't protect children in my opinion. It will result in many more topic areas falling below the censorship threshold.
For example, YouTube currently has quite a lot of really good videos on harm reduction for drug users (and probably also a bunch that are not very good and/or directly misleading). I would expect all such videos to be removed if a child protection law like this were passed, because any neutral discussion of drug use short of total condemnation is typically perceived as encouragement. That would deprive people of informative content that could otherwise have saved their lives.
All these concerns get muddled when YouTube is the example: it is such a blind meta-machine optimising for ad revenue that it is already actively pushing all kinds of harmful content.
The problem with any law passed for a good purpose is that, even if you can get everyone to agree on the general statement of the purpose, there is disagreement about what actually counts as achieving it, on both moral and scientific grounds.
For example, providing information on how to do some harmful thing X more safely might increase the number of people doing X. On the moral side, someone might argue that even one more person doing X is worse than the reduction in harm for everyone else doing X. On the scientific side, there is likely no direct evidence for the exact numbers (ethical concerns with such research and all that), so people will disagree on how much the harm is increased or reduced, and different estimates can both be reasonable yet lead to different conclusions given the lack of direct research.
This all becomes supercharged when it comes to children, and you'll find people not even being consistent in their modes of thinking across different topics (or arguably they are consistent, but are basing it on unsaid, unshared assumptions and models they might not even be consciously aware of, though that gets into a bunch of linguistic and logical semantics).
Big tech censorship disgusts me. Everything is completely backwards from what it should be, and the sheer scale of those platforms (bigger than many countries by population or money) prevents individual people and even governments from exercising meaningful democratic oversight. So these platforms congregate hundreds of millions of people, and whatever their CEOs and/or douche tech bros in SV decide becomes law.
Another example: videos about the Holocaust or WWII atrocities. Every one of them is demonetised and hidden from recommendations because it touches a horrifying topic. Does that harm children? On the contrary: in an age of global waves of fascism, nothing is more important than a lesson in how it went last time.
Meanwhile the whole platform is a cesspool of addictive brainrot, gambling ads, turbo-consumerist toy unboxing videos, etc. Things that are actually truly harmful to kids. These are not restricted, these are promoted!
War is peace etc etc. Good is evil and evil is great. Everything is backwards.
The thing to understand is in your last paragraph: everything big ads does is, unsurprisingly, focused on making people into worse versions of themselves. You wouldn't let kids go to a casino or a porn site for educational material. Don't let them use YouTube either.
It could be that someone happened to post educational videos to the porn site. If so, you might as well download them while you have the chance, but don't mistake their existence for an indication that that's what the site is for. They're still less than 0.1% of the videos, and you'd need to specifically search for them or be linked to them to find them. Assume you'll need to look elsewhere for educational material; e.g. there are tens of thousands of video results for "Holocaust" on WorldCat.
The problem is still not solved, because search also returns a lot of garbage, and you don't want kids on a site that's 99% garbage. Xvideos could have a large library of science and history videos while still being 99% porn. Like I said, adults should download and curate the good stuff but recognize they're still in the seedy part of town, shouldn't let kids go there, and shouldn't expect it to be a platform for learning. That's just not its purpose. In fact, YouTube's purpose is basically the opposite of personal growth.
You can get literal PBS at pbs.org for $5/month, or from your local library for free.
The problem is that YouTube, and Google as a whole, actively encourage the use of YouTube in schools, home schooling, and education in general. Google Workspace for Education is free for educational institutions. They also have a curated YouTube Kids app with a giant feed of brainrot that they consider safe for kids, but only because the content doesn't show anything graphic or use bad language.
On the other hand, porn services are (generally) actively blocked in educational institutions, so their content, regardless of its educational quality, will never be suggested to kids, because kids are not the target audience. (Not to mention the legal trouble these services would get into for actively enticing minors.) I doubt we'll see a "PornHub for Kids" or RedTube signing a contract with Blippi or Miss Rachel.
Half their business is propaganda (the other half being surveillance), so of course they represent themselves as something positive. Recognize them for what they are. Point it out to others. Advocate for banning them in schools. Warn parents that YouTube Kids is not appropriate for children. They do near-zero curation. They don't commission the creation of educational content. They are nothing like PBS (which another commenter compared them to). More generally, ads are not child-appropriate. These platforms have some useful content, but on the whole they undermine the teaching of virtues; in fact their entire purpose is to push the opposite.
I agree 100%, and in another comment I suggested an alternative to child protection laws: we should severely restrict the viability of the ad tech business model altogether. While it does make certain niche content creation financially viable that otherwise wouldn't be, in the grand scheme of things the negative externalities outweigh the good.