YouTube has been repeatedly told about videos that are abusive towards children, and they do nothing about it. They're not interested in effective solutions.

YouTube is user-generated content, which is precisely why I would prefer they add an RTA header. Random people uploading videos can claim to be kid-friendly when they are not. Take that responsibility away from the uploaders and away from YouTube and hand it to the parents. Less work, liability, and cost for YouTube should be a nifty incentive, at the risk of blocking some advertising to children, which is another loaded topic altogether.


> Take that responsibility away from the uploaders and away from YouTube and hand it to the parents.

The system described still requires action by the webmaster. Their options are: deny the entire site to those sending an RTA header; evaluate the content themselves; or trust the uploader. (Or a combination: have uploaders opt in to evaluation for a fee, with the content denied to kids by default.)


The client does not send an RTA header. The RTA header is only sent by the server or load balancer, by design. No action is required by website operators and owners beyond enabling the header on any URL that is either adult or user-generated content.

It is up to the client what to do with the header, which right now is nothing. A law would be required to get the snippet of code added to user agents. I estimate it would take an intern one afternoon to get it into the clients they support, not counting dev/QA, management approval, etc.

Challenge to FAANG: show off your interns! There is no harm in adding the code required to detect this header. Below is an example of the header to detect, as sent from nginx. If you detect this header, activate nanny controls. To be safe, do a separate parental_build to get manager approval.

     add_header Rating 'RTA-5042-1996-1400-1577-RTA' always;
All one need detect is: RTA-5042-1996-1400-1577-RTA

For fun, search for this on Shodan.
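As a rough sketch of what that client-side check could look like (a minimal Python example using the requests library; the URL and the "activate nanny controls" hook are placeholders for illustration, not any vendor's actual API):

    import requests

    RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

    def is_rta_labeled(url: str) -> bool:
        # A HEAD request is enough; only the response headers matter.
        response = requests.head(url, allow_redirects=True, timeout=5)
        return response.headers.get("Rating", "") == RTA_LABEL

    if is_rta_labeled("https://example.com/"):
        # Hypothetical hook: a real user agent would switch on its
        # parental-control / restricted mode here.
        print("RTA label detected: activate nanny controls")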


> The RTA header is only sent by the server or load balancer, by design. No action is required by website operators and owners beyond enabling the header on any URL that is either adult or user-generated content.

The website owners and operators have to decide which URLs get the header. If the categorization is "either adult or user-generated content", then I already covered that for the case of YouTube: i.e., the entire site is denied to kids (whose parents opt in).


> the entire site is denied to kids

I also covered that here [1]. Indeed, if parents do not enable all of YouTube, or YouTube does not move most adult content under a unique URL, or their server does not send the header only on content flagged as adult, the kids will not be advertised to. They would have to go to a kid-friendly site that moderates before a video is viewable, or YouTube would have to change moderation tactics. Kids need not visit YouTube; there are kid-friendly sites.

[1] - https://news.ycombinator.com/item?id=46152727


Out of curiosity, how would YouTube implement an RTA header? Which resources would have the header and which wouldn’t?

> Out of curiosity, how would YouTube implement an RTA header?

Their app developers, unless it is set globally, in which case their network engineering team.

> Which resources would have the header and which wouldn’t?

If the app developers send the header on any video flagged as adult, then just those specific videos. If they created a unique URL that all adult content resided under, then it could potentially be the network engineers. It really depends on how much work they put into it so that more people could still view the content, assuming user agents become legislated to check for the header.
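As a hypothetical sketch of the per-video approach (Python/Flask; the route layout and the flagged-video lookup are assumptions for illustration, not YouTube's actual architecture):

    from flask import Flask, make_response

    app = Flask(__name__)

    RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

    # Stand-in for a real moderation database lookup.
    ADULT_FLAGGED_VIDEOS = {"abc123"}

    @app.route("/watch/<video_id>")
    def watch(video_id):
        response = make_response(f"player page for {video_id}")
        if video_id in ADULT_FLAGGED_VIDEOS:
            # Only videos flagged as adult carry the RTA header;
            # everything else stays visible to filtered clients.
            response.headers["Rating"] = RTA_LABEL
        return response

If instead all adult content lived under one URL prefix, the same header could be set once at the server or load balancer, as in the nginx example above.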


I mean, they have invested a ton into their kid-friendly mode, and there have been quite a number of “adpocalypses” where ad revenue for many content creators was dramatically slashed due to YouTube’s over-zealous moderation.

It is a serious business concern: there are occasional panics triggered by consumers complaining that a brand’s ad is shown next to, and benefits from the attention on, some distasteful content, and YouTube starts to bleed important advertisers en masse. YouTube then gets defensive and demonetizes (removes all ads from), or tags as adult-only, any video that may be concerning, where avoiding false negatives takes much higher precedence than avoiding false positives.

Of course this is not directly tied to protecting children, but this incentive structure is partly aligned and it is a strong one.


Their kid-friendly mode is still completely full of absolute crap that you wouldn't want your kid to see.

It is definitely mind-rotting crap, but I think they are very strict with technically inappropriate content.

I agree that they are not doing a good job, but one can’t say they aren’t making massive efforts around it either.



