Hacker News | new | past | comments | ask | show | jobs | submit | JohnMakin's comments

Early 2024, if you had speculated like this about Persona's broader goals, you would have been called nuts. It has become increasingly obvious since, though.

This seems to make the classic mistake everyone makes of conflating two different things - programming, and business logic/knowledge (and I'd also throw complex-systems knowledge in there too).

Often, understanding the code or modifying it is the easy part! I'm sure a decent number of people on this website could master COBOL sufficiently to go through these systems and make changes to the code.

However, if my own career has taught me anything, knowing why those things are there, how it all fits together in the much broader (and vast) system, and the historical context behind all of that - that is the knowledge being lost, not the ability to literally write or understand COBOL.


> knowing why those things are there

I'm pretty sure they're talking about converting COBOL to Python or Go, and that conversion is the benefit. It doesn't require knowing the architecture and system design. I'm not familiar with COBOL and COBOL systems, so I could be wrong... but Python programmers who can then study the system are easy to find.


This is fintech - I've not worked in banking specifically, but I've worked in fintech (or fintech-adjacent) for most of my career, and from my POV these things can get insanely complicated in very unintuitive ways, because the financial world is messy and complicated.

I've never worked on COBOL systems specifically, but going from my experience working on fintech problems in dense legacy stacks of various languages (Java is common) that are extremely hard to understand at times, the language itself is rarely, if ever, the problem.

"Just need to convert it to Go or Python" is exactly the fallacy I am trying to describe. The language isn't the issue (IME). I do have my gripes about certain Java frameworks, personally, but simply rewriting the system in another language doesn't make it any easier to understand from my POV.

Even if it were that simple in the case of COBOL - these are often extremely critical systems that cannot afford to fail or be wrong very often, or at all, and they have complex mechanisms built around them to make it so. Even trying to migrate to a new system/language would inevitably involve understanding the system and architecture.


That's true. COBOL is pretty easy to read, so the language is not the problem. The project then becomes a rewrite, and that's almost never a good idea. Perhaps in the future, when AI can convert the software and verify the logic.

> knowing why those things are there, how it all fits together in the much broader (and vast) system, and the historical context behind all of that, is what knowledge is being lost

How big is your context window? How big is Claude's context window? Which one is likely to get bigger?


RAM had been sold out so…

I call it the "how hard could it be" fallacy.

Right in the "it takes 15 minutes to do it" category.

Sure, yet compare admin vs. engineering in terms of jobs... one is now on the decline, either slowly or quickly. The profession now requires 1/4 to 1/2 of the engineers it once employed. I don't see how that's a good thing for any economy.

Due to existing health concerns, I self-isolated in my home from the start of the covid outbreak in 2020 until spring 2021, when the first vaccines became broadly available. I only recall leaving a handful of times to pick up medicine when delivery wasn't an option, but other than that, almost no venturing into the outside world beyond the few steps onto my front porch to grab deliveries and groceries.

I learned a lot about myself. I love being alone, more than most people, but after a few months I did start to feel I was going a bit crazy. This was made worse by the fact that at that point in my life I had had a big drop-off in friend groups (mostly people getting married/moving/having kids/etc.). My health and hygiene definitely suffered. The tipping point for me, and I'm still unsure about it to this day, was that I felt I was having auditory hallucinations (mostly hearing my name). I ended up joining a group video-chat app that pitched itself as unofficial group therapy, and things improved a lot. This is how I learned I'm not actually an extreme introvert like my doctor liked to tell me I was, and since this experience I consider myself far more extroverted than I used to. I now make an effort to socialize once a week even if I really don't feel like it.


Seems pretty similar to me. I do not like people, but I don't dislike talking to people.

Speculation: What you actually like is independence - not being bound by others or chained by those around you - the feeling of freedom. However, it does make you go a bit crazy, so I recommend getting at least something to take care of, like a pet or even some plants, since humans historically rely on a purpose and the lack of one has pretty severe side effects. Chaining yourself by getting something to take care of is an easy way to soften the impact while still experiencing that freedom.


Huh? Even if you knew and understood the scope of it before (I'd say a vast majority did not, and thought they were just red-light cameras), it is not very hard to understand that when you see people in masks without badges snatching your neighbors haphazardly and for specious reasons, a chunk of that majority might look at the cameras more skeptically and maybe, just maybe, wonder if that technology could be turned against them too.

Until recently very few people could articulate the real risk this tech posed; now you can literally see it play out (depending on where you live).


Note that there are sites like this that are "legit" in the sense that they just forward your request to the .gov site for an extra fee, and you do get your documents later - I made this mistake once with a birth certificate request on one of these sites.

We'll try everything, it seems, other than holding parents accountable for what their children consume.

In the United States, you can get in trouble if you recklessly leave around or provide alcohol/guns/cigarettes for a minor to start using, yet somehow, the same social responsibility seems thrown out the window for parents and the web.

Yes, children are clever - I was one once. If you want to actually protect children and not create the surveillance state nightmare scenario we all know is going to happen (using protecting children as the guise, which is ironic, because often these systems are completely ineffective at doing so anyway) - then give parents strong monitoring and restriction tools and empower them to protect their children. They are in a much better and informed position to do so than a creepy surveillance nanny state.

That is, after all, the primary responsibility of a parent to begin with.


I know this is weird, but I'm in some ways not really sure who is on the side of freedom here. I get your position, but like. The whole idea of the promise of the internet has been destroyed by newsfeeds and mega-corps.

There are literally documented examples of Facebook executives twirling their mustaches, wondering how they can get kids more addicted. This isn't a few bands with swear words; in fact, I think the damage these social media companies are doing is reducing the independence of teens and kids in exactly the way parents originally feared.

I dunno, are you uncertain about your case at all or just like. I just like, can't help but start with fuck these companies. All other arguments are downstream of that. Better the nanny state than Nanny Zuck.


> I just like, can't help but start with fuck these companies. All other arguments are downstream of that.

The solution would then be to break them up or do things like require adversarial interoperability, rather than ineffective non-sequiturs like requiring them to ID everyone.

The perverse incentive comes from a single company sitting on a network effect. You have to use Facebook because other people use Facebook, so if the algorithm shows you trash and rage bait you can't unilaterally decide to leave without abandoning everyone still there, and the Facebook company gets to show ads to everyone who uses it and therefore wants to maximize everyone's time wasted on Facebook, so the algorithm shows you trash and rage bait.

Now suppose they're not allowed to restrict third party user agents. You get a messaging app and it can send messages to people on Facebook, Twitter, SMS, etc. all in the same interface. It can download the things in "your feed" and then put it in a different order, or filter things out, and again show content from multiple services in the same interface, including RSS. And then that user agent can do things like filter out adult content, if you want it to.

We need to fix the actual problem, which is that the hosting service shouldn't be in control of the user interface to the service.


Indeed, "interoperability" is what would hurt social media giants the most. Cory Doctorow recently gave an excellent talk in which he noted that back in the early '00s, Facebook (and others) used interoperability to offer services that let users interact with, push to, and pull from MySpace (the big dog back then) to siphon off their users and content. But once Facebook became the dominant player, they moved to make the exact tactics they had used (interoperability and automation) illegal. Talk about regulatory capture...

> ineffective non-sequiturs like requiring them to ID everyone.

Is that really a non-sequitur though? Cigarettes are harmful and addictive so their sale is age gated. So too for alcohol. Gambling? Also yes. So wouldn't age gating social media be entirely consistent in that case?

Not that I'm necessarily in favor of it. I agree that various other regulations, particularly interoperability, would likely address at least some of the underlying concerns. But then I think it might not be such a bad idea to have all of the above rather than one or the other.


If I went to the store and asked for a pack of cigarettes, I show my ID (well, I would if I was carded, but I'm no longer carded :)) and the clerk looks at it, maybe scans it, then takes my money.

If I try to go to an adult website, or even just a Discord server with adult content, I need to upload my ID. And now there are numerous third parties looking at my ID, and I have no idea if I can trust them with my info. Indeed, I probably can't, given how many of them have already been breached.

Of all people, PornHub actually has a pretty good write-up on this (1) (2); they refer to "device-based" age verification, where you verify your identity once to, say, Google or whoever. Then your device proves your age. Fewer middlemen. One source of truth.

I am not against age verification. I am against the surveillance state.

(1) https://www.pornhub.com/blog/age-verification-in-the-news

(2) https://www.xbiz.com/news/281228/opinion-why-device-based-ag...


> they refer to "device-based" age verification, where you verify your identity once to say, Google or whoever. Then your device proves your age. Fewer middlemen. One source of truth.

This is still an absurdity. You don't need the device to prove the age of the user to the service, you need the service to provide the age restriction of the content to the device. Then the device knows if the user is an adult or a kid and thereby knows whether to display the content, and you don't need Google to know that.


Major porn sites already send an RTA header. Social media could be required to do similar. However I think part of the concern here is that many parents don't bother to restrict things. So the question is if we want filtering similar to alcohol where minors aren't permitted to possess it, or similar to porn where the decision is left up to parents.
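For the curious, that client-side filtering is simple to sketch. Here is a minimal, hypothetical check - the RTA label string below is the one the Restricted To Adults association publishes, but the function name and the sample responses are made up for illustration:

```python
# Sketch of device-side filtering on the voluntary RTA
# ("Restricted To Adults") label. Sites can send it either as a
# `Rating` HTTP header or as a <meta name="rating"> tag in the page.
RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

def is_adult_content(headers: dict, html: str = "") -> bool:
    """Return True if the response self-identifies as adult-only."""
    if headers.get("Rating", "").strip() == RTA_LABEL:
        return True
    # Fallback: look for the meta-tag label in the HTML body.
    return RTA_LABEL in html

# A parental-control proxy or browser extension would run this check
# before rendering - no ID upload involved on either side.
blocked = is_adult_content({"Rating": RTA_LABEL})                        # True
allowed = is_adult_content({"Content-Type": "text/html"}, "<p>news</p>")  # False
```

The parent's point holds either way: the label only works if the parent actually configures the child's device or network to honor it.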

IRL

> If I went to the store and asked for a pack of cigarettes

online

> and I have no idea if I can trust them with my info

Why did you trust how your ID was scanned (if carded)?

With security cameras present, where did that scanned data end up?


nit: the Discord ID verification hasn't rolled out yet has it?

No, I believe it's next month though.

> Is that really a non-sequitur though?

You have something (human communication) which is not intrinsically harmful -- indeed it is intrinsically necessary -- but has been made harmful on purpose. That is very much unlike those other things, where the harm is in their very nature and isn't prevented by the provider just not being a schmuck on purpose.

That makes age gating a farce, because kids need to be able to communicate with other people, but you would end up in one of these scenarios, each of which is inane:

1) Providers all put up age restrictions and meaningfully enforce them, and then teenagers are totally prohibited from communicating over the internet.

2) Providers all put up "age restrictions" which teenagers bypass in ten seconds, and the whole thing is a pointless fraud.

3) You try to separate places for kids from places for adults, but then either: a) adults prefer adult spaces where they're not censored, so they congregate there and those spaces get the network effect, and teens have to sneak in even if they're not looking for adult content because that's where the bulk of all content is; or b) nobody likes to show ID even if they're an adult, so adults congregate in the least restrictively moderated space where they don't have to show ID, and that space gets the network effect. Then to the extent that they censor, they're censoring the adults, which is the thing that wasn't supposed to happen, and to the extent that they don't, you have a "kid space" that contains adult content.

It's a trash fire specifically because there's a network effect, which is an aggregating force causing adults and kids to be in the same space so they can communicate with each other. Then the space with the network effect would either have to censor the adults even though they can't leave because of the network effect, or not censor the adults and then have adult content in the space the kids have to be because of the network effect.

The way you fix this is not by trying to separate kids and adults into separate networks; it's by tagging specific content so the client device can choose not to display adult content if the user is a kid. This also solves the privacy issue, because you don't have to provide any ID to the service when the choice of what to display happens on the client, and the service is only tasked with identifying the content.


> ... start with fuck these companies. Better the nanny state than Nanny Zuck.

I'm not sure how those two positions connect.

Execs bad, so laws requiring giving those execs everyone's IDs, instead of laws against twirled mustaches?


These are just bad arguments all around, including the gov't with this upload-ID crap. Why aren't we making the internet 18+? The only irrefutable answers I get are downvotes, which is OK I guess; it sort of validates my point, because there's no reason for kids to get unrestricted internet access, and downvotes are easy.

How well would anything like that work in practice?

First of all, would we restrict all internet access, or just access to certain known sites and VPNs, letting everything else through because it's too insignificant even if it technically might merit being blocked for kids? I don't think a global internet block for minors is a good idea.

On wired internet, restricting access for devices that aren't clearly tied to individual users is problematic. Imposing age verification overhead on anyone who runs a network is unacceptable and unworkable. Locking non-mobile devices to individual users, in order to have mandatory software that blocks or sends age signals to the ISP, is also unacceptable and unworkable.

For mobile devices, maybe. There's a privacy problem if it's required for sim cards to be paid using credit cards, but if we do that, or if that's already effectively the case, I think it's fair that anyone who has an active credit card should be permitted on the "adult" internet. For multi-line accounts, we could make it a crime for the account holder to misrepresent age of the user of a line, i.e. to claim they're an adult when they're really a minor. Not very different from minors and cigarettes. It's not universally illegal for a parent to supply them, but it is in some places, and it should be.


I posted my plan below, but essentially kids get a whitelist at best. For example, a kid-friendly access device allows a network connection only to a VPN server certified safe for kids, which then handles whitelisted destinations from there. Blacklists are just whack-a-mole.

https://news.ycombinator.com/item?id=47122715#47131415


> I just like, can't help but start with fuck these companies. All other arguments are downstream of that. Better the nanny state than Nanny Zuck.

Wild times when we're seeing highest voted Hacker News commenters call for the nanny state.

If you're thinking these regulations will be limited to singular companies or platforms you don't use, there is no reason to believe that's true.

There was already outrage on Hacker News when Discord voluntarily introduced limited ID checks for certain features. The invitations to bring on the nanny state reverse course very quickly when people realize those regulations might impact the sites they use, too.

A lot of the comments I'm seeing assume that only Facebook or other big platforms would be impacted, but there's no way that would be the case.


I don't even care about Discord adding ID verification to unlock certain features. Not going to give them my ID of course, just gonna use it as always. If they later tighten things to the point where it's unusable, sure, I'll quit Discord.

OK, here's another one.

How about taking all these websites that require PII onto their own members-only domain?

This actually should have been in place and well fleshed-out before Google & Microsoft started pushing their "account" nonsense.


>Better the nanny state than Nanny Zuck.

For me this is the crux, at least in principle. Once online media is this centralized... the argument from freedom is diminished.

There are differences between national government power and international oligopoly but... even that is starting to get complicated.

That said... This still leaves the problem in practice. We get decrees that age-restriction is mandatory. There will be bad compliance implementations. Privacy implications.

Meanwhile... how much will we actually gain when it comes to child protection?

You can come up with all sorts of examples proving "Facebook bad", but that doesn't mean these things get fixed when/if regulation actually comes into play.


Those execs were also using the tactics to addict adults, and while they may have targeted teens, the problem is, at its core: humans. So no amount of nannying by either the company nor the government will solve this issue.

Who would be responsible if a child developed alcohol addiction? A nicotine problem? Any other addiction?

Exactly. The same people that should be responsible for giving them unfettered access to an internet that is no longer safe. Even adults have to be wary of getting hooked on scrolling, and while I agree that the onus is on the companies, it has been demonstrated over and over again that they will not be held to account for their behavior.

So the only logical choice left that actually preserves freedom is for parents to get off their asses and keep their children safe. Parents that don't use filtering and monitoring software with their children should be charged with neglect. They are charged for sending a kid into the cold without a coat, or for letting them go hungry - why is sending them onto the internet any different?

And to your last point: You are dead wrong. No government anywhere in the world has demonstrated that they have the resources, expertise, or technical knowledge to solve this problem. The most famously successful attempt is the Chinese Great Firewall, which is breached routinely by folks. As soon as a government controls what speech you are allowed to consume, the next logical step for them is to restrict what speech you can say, because waging war on what people access will always fail. I mean, Facebook alone already contains tons of content that's against its terms of service, and they have more money than God, so either they actually want that content there, or they are too understaffed to deal with the volume, and the volume problem only ever increases.

So in my view, you are the one against freedom by advocating for the government to control the speech adults can access for the sake of "protecting the children" when the actual people that are socially, morally, and legally culpable for that protection are derelict in their duties.


> Who would be responsible if a child developed alcohol addiction? A nicotine problem? Any other addiction?

The government literally actively prevents people selling all these things to children, rather than permit a free for all and then expect parents to take responsibility for steering their kids away from them.


Meta for one has proven terminally irresponsible at acceptable stewardship.

Maybe it's about time that the proven predatory companies be restricted to something like their own adults-only internet cafes where age can be checked at the door.

They had their chance with the open internet and they blew it.


> Who would be responsible if a child developed alcohol addiction? A nicotine problem? Any other addiction?

I mean, historically speaking, we blamed the tobacco companies.


Did we? I know they lost some court cases, had to adjust advertising and so on, but was any tobacco company actually held accountable for the harm they caused? The answer is no because they all still exist and are profitable entities. Corporations that cause the harm they did should be subject to dissolution.

Also, if the companies were genuinely responsible, why can a child's parents be held accountable for the child developing an addiction? "The company was responsible, not the parent"... do you see how ignorant that sounds?


The de jure minimum age to purchase tobacco is now 21 in the US, so I guess anyone seen selling tobacco products to those under that age could be held responsible as well.

They are held responsible by paying a fine to the government or losing their tobacco license, which is better than nothing, but doesn't actually fix the harm they caused already for the kid that's now hooked.

The legal system does nothing to fix the harm done by murder to the person who's now dead, either.

Social media is like tobacco. We went after tobacco for targeting kids, we should do the same to social media. Highly engineered addictive content is not unlike what was done to cigarettes.

Yes, go after Facebook and their kind only, avoid collateral damage to the remaining regular old internet.

No, it isn't. Tobacco is a physical substance that alters users' biochemistry and creates a physical dependence. Social media is information conveyed via a computing device. You can criticize social media for what it is in its own right, without having to engage in these kinds of disingenuous equivocations.

Sounds like you need to read up on dopamine and addictions a bit more.

Gambling doesn't introduce a substance into the user's system; it makes use of existing brain chemicals.

Social media companies borrowed every addictive mechanism from gambling to alter the brain chemistry and reactions of their users.


> Sounds like you need to read up on dopamine and addictions a bit more.

Nah, I just need to not equivocate between them. The use of the same term to describe activities that produce a dopamine response as is used for ingestion of chemicals that create a direct physical dependence is little more than a propaganda tactic.


The comment said social media is addictive "like tobacco." Not that it's literally a drug.

You're blurring the lines a bit. Gambling isn't inherently an addiction, just like a good TV show isn't inherently addictive either. Social media trying to be more engaging shouldn't really be viewed as an evil action any more than HBO trying to create compelling content is.

The problem with comparing social media use to tobacco is that they are completely different. It's like saying weed is just like heroin because they both make you feel good. It's reductive and not productive.

The completely anti-social-media stance ignores the good parts of social media. People can connect from across the planet and find others who share the same views or experiences. People who are marginalized can find community where none may exist in their local area. So we should approach this carefully and stay grounded.


Maybe this will make it clearer: the big difference is that people can connect across the planet without "big social media".

There are internet forums, chats, e-mail, blogs - there is no inherent need for "big social media" as we know it. I do understand those companies made it much easier for the average person to participate, but still, using an internet forum or e-mail isn't exactly rocket science.

Here we are on HN, where no one is changing the layout and not doing much to drive engagement. Some days I don't even open any discussion because there is a lot of stuff that is not interesting for me.

"Big social media" companies have already had multiple people speak up and explain that they specifically made changes to drive engagement - to hook people and keep them scrolling - without "creating compelling content". They specifically tuned feed algorithms to promote lowest-common-denominator trash content that makes people react in anger/frustration/whatever, not to "create/promote compelling content".


So now you're demonstrating that you can criticize social media for its own flaws without having to conflate it with something else. I don't disagree with anything you're saying here, but nothing you're saying here involves attempting to equivocate social media with physical substance abuse.

Comparing internet forums, chatrooms, email, and blogs to Facebook and TikTok seems like a bad joke. I don't think you recognize how impactful "Big Social Media" is. Facebook brought about the ability to easily reconnect with people you had lost touch with and stay in touch with them. Things like Instagram made photo sharing and discovery significantly easier than simply browsing the most recently posted photos on Photobucket. TikTok mass-marketed bite-sized videos and community trends. These things either did not happen on other platforms or could not happen on them.

I think most people remember the earlier days of Twitter, where having a centralized place with strong discoverability led to unique communities forming and expressing themselves. I shouldn't need to say this, but it obviously wasn't all sunshine and rainbows, so I'm not saying these platforms were perfect or without major issues. I am saying that their unique nature is not something that can be replicated via other mediums. It simply doesn't scale.

Honestly I'm not seeing the issue with these platforms wanting to maximize time users spend on them. That's the goal of every business. What seems to get lost though is self control. TikTok being fun and enjoyable does not mean that you are incapable of closing the app. It's like banning phones from leaving your house because you are so addicted to texting and apps. You cannot fully control what comes up on most social media. But as any therapist will tell you, all you can control is your response. I just think there is a space for big social media sites in the world. I don't even use them, but I can recognize the impact they have made with the good and the bad.


Nothing is inherently an addiction. You can smoke a cigarette without it being an addiction.

No, nicotine is actually addictive in that it creates physical dependency.

I don't think I implied that. Of course it does, but the ability to regulate usage is hampered by nicotine. That does not mean one cigarette and you're addicted, though.

You can make the point that social media has real positive benefits as well as negatives without minimizing the well proven fact that gambling creates a form of addiction in a significant proportion, though not all, of its users, one every bit as devastating as heroin or alcohol.

Seems like you're overestimating how many people are addicted to gambling. Much in the same way those who are anti-alcohol will conflate responsible drinking with alcoholism. Gambling can be just as terrible, but it is different than heroin and alcoholism since it does not have a chemically addictive component. Reducing all addictions to being the same thing is damaging to addicts and addiction recovery. Much the same way reducing all crime to the same thing is for inmates of the prison system. You're removing nuance and difference which helps promote understanding.

May I introduce you to the delta-FosB gene?

https://en.wikipedia.org/wiki/FOSB#DeltaFosB


Can I ask what exactly you're intending to say? I'd rather not try to guess what you're implying.

You’re right, it’s actually worse than tobacco. Tobacco simply makes your body sick, but social media attacks the most vital part of us. Even the CDC has studied this: https://www.cdc.gov/mmwr/volumes/73/su/su7304a3.htm

This is a normative cultural question, not a medical one. The CDC is far outside its expertise and its proper remit by involving itself in this topic.

the mechanisms by which that information is being conveyed have been shown to be addictive as well, no?

No, addiction involves physical substances interacting with a person's biochemistry. Attempting to extend the concept of addiction to include positive emotions brought on by sensory experiences or behavior is a disingenuous rhetorical tactic.

It's simply not legitimate to redefine "addiction" as anything that people might have an emotional or psychological motivation to participate in.

People trying to use the same terminology to describe social media as is used to describe tobacco or alcohol are trying to sneakily attach the negative associations of those substances to something unrelated entirely to them.

This is a form of deception, and a silly one, since social media has lots of negative aspects that can be argued against in their own right, without needing to engage in manipulative dialog.


Whether you like it or not, addiction is most commonly used as a general term to refer to any sort of compulsive behavior that acts against one's own self interest. Not your strawman of "anything that people might have an emotional or psychological motivation to participate in."

There are plenty of perfectly valid parallels between addiction to alcohol, gambling, porn, social media, junk food, etc. Are you denying that?

You can't just declare anyone comparing them to be disingenuous or disrespectful to those who are addicted. In fact what really seems disingenuous is the huge volume of this kind of pedantry in the thread by you and the same few accounts. Feels like misdirection away from the actual discussion about how to truly mitigate these addictions. Would appreciate your actual thoughts on this.


Comparing tobacco to social media is like comparing me to LeBron James. I'd rather have my kid smoke a pack a day than have social media accounts.


Screw over Meta then. Not everybody else.

Meta is the bozo in a panel van with no windows. All the legit porn sites put up Big Blinking Neon Signs.


I actually run an adults-only community site, and you are correct: I have it in a popup that appears on every "fresh" visit to the site, it's in the giant bold print you agree to when you register, and on the technical end, I send every possible header and other signal to let filtering software know it's an adult-only space. If a child is accessing that site, they are doing so because their parent didn't even attempt to prevent it. And now I'm having to look into ID verification services that are going to quintuple the cost of hosting this free community, at a time when community is more important than ever.

Can't you just get people to email you an ID photo when they sign up?

Email is the digital equivalent of a postcard. I really want to argue that this is a bad idea (because it is), but depending how you set your email system up, it might actually compare favourably to using a third-party identity verification company.

Better to use some kind of secure drop web portal (perhaps https://securedrop.org/) that's actually designed for that kind of thing, though.


Honest question. Why do you care?

> Better the nanny state than Nanny Zuck.

why-not-both.jpg

Maximizing corporate freedom leads inevitably to corporate capture of government.

Opposing either government concentration of power alone or corporate concentration of power alone is doomed to failure. Only by opposing both is there any hope of achieving either.

Applying that principle to age-verification, which I think is inevitable: Prefer privacy-preserving decoupled age-verification services, where the service validates minimum age and presents a cryptographic token to the entity requiring age validation. Ideally, discourage entities from collecting hard identification by holding them accountable for data breaches; or since that's politically infeasible, model the service on PCI with fines for poor security.

The motivation for this regime is to prevent distribution services from holding identification data, reducing the information held by any single entity.
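The token flow proposed above can be sketched in a few lines. To be clear, this is a toy illustration, not any real service's protocol: a production system would use asymmetric signatures or a zero-knowledge scheme so the relying site never shares key material with the verifier; plain HMAC with a shared key is used here only to keep the example self-contained and runnable, and all names are made up.

```python
import hashlib
import hmac
import secrets

# Toy sketch of a decoupled age-verification token, per the proposal above.
# NOTE: a real deployment would use asymmetric signatures or zero-knowledge
# proofs; the HMAC shared key here is purely illustrative.

VERIFIER_KEY = secrets.token_bytes(32)  # held by the age-verification service


def issue_token() -> bytes:
    """Verification service: attest 'age >= 18' with no identity embedded."""
    nonce = secrets.token_hex(16)  # fresh per token, so tokens can't link visits
    claim = f"age>=18;nonce={nonce}".encode()
    sig = hmac.new(VERIFIER_KEY, claim, hashlib.sha256).hexdigest().encode()
    return claim + b"." + sig


def check_token(token: bytes) -> bool:
    """Relying site: learns only that a valid over-18 claim exists."""
    claim, sep, sig = token.partition(b".")
    expected = hmac.new(VERIFIER_KEY, claim, hashlib.sha256).hexdigest().encode()
    return bool(sep) and claim.startswith(b"age>=18") and hmac.compare_digest(sig, expected)
```

The point of the decoupling is that the entity requiring validation sees only the claim and the signature, never the underlying ID document.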


> Prefer privacy-preserving decoupled age-verification services, where the service validates minimum age and presents a cryptographic token to the entity requiring age validation.

This is the wrong implementation.

You require sites hosting adult content to send a header indicating what kind of content it is. Then the device can do what it wants with that information. A parent can then configure their child's device not to display it, without needing anybody to have an ID or expecting every government and lowest bidder to be able to implement the associated security correctly.

It doesn't matter what kind of cryptography you invent. They either won't use it to begin with or will shamelessly and with no accountability violate the invariants taken as hard requirements in your theoretical proof. If you have to show your ID to the lowest bidder, you're pwned, so use the system that doesn't have that.
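For what it's worth, a version of this header scheme already exists: the voluntary RTA ("Restricted To Adults") label, which adult sites can send as an HTTP header or meta tag. The device-side filter described above is then a few lines of code. The sketch below is hypothetical (the function name and header dicts are made up); only the RTA label string itself is the real, existing label.

```python
# Hypothetical device-side filter for the header-labeling scheme described
# above. "RTA-5042-1996-1400-1577-RTA" is the real RTA label some adult
# sites already send; everything else here is illustrative.

RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"


def should_block(response_headers: dict, child_mode: bool) -> bool:
    """A child-configured device blocks labeled content; adults see everything.
    No ID, and no round-trip to a verification service, is ever needed."""
    if not child_mode:
        return False
    return RTA_LABEL in response_headers.get("Rated", "")


adult_site = {"Rated": RTA_LABEL, "Content-Type": "text/html"}
news_site = {"Content-Type": "text/html"}
```

All of the decision-making stays on the device the parent configured, which is the whole argument: no party ever has to collect anyone's ID.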


This solves some problems, such as children accessing porn sites (oh the horror). But it doesn't solve other problems, such as predators accessing children's spaces. YouTube Kids is purportedly a safe, limited place for kids, and yet there are numerous disturbing videos that get past the automated censors. Pedophiles stalk places like Roblox.

> But it doesn't solve other problems, such as predators accessing children's spaces.

Neither do ID requirements. If you're purposely allowing in kids then you're allowing in everyone, because kids generally don't have ID.


Sure, but other forms of age verification requirements can, in principle, solve this (at the massive cost of many other privacy and compliance issues, as the article rightly points out). For example, periodic facial recognition-based age estimation can theoretically allow only kids' accounts to a certain space.

At which point you're still letting in every pedo who has a kid living with them or can grab one at a local school, and the child trafficking networks that by their nature have access to children or to cybercriminals who know how to fool the check with a fake camera, i.e. the worst of the worst.

Meanwhile you exclude the parent who is separated from their spouse and wants to check up on where their kid is hanging out when the kid is living with the other parent, and the investigative journalist who doesn't have a young kid or their kid is 16 but the detection system guesses they're 26.

And that's on top of having the lowest bidder building a biometrics database of children.


Then you do forensics and catch the predator, instead of this age verification nonsense

Your proposed architecture also achieves the goal of discouraging content-distributing entities from holding hard identification data, so it sounds good to me.

> Better the nanny state than Nanny Zuck.

This is a huge self own. I can't believe I'm reading this on a website called "hacker news".


while i'm sympathetic to your position, the truth is that /that/ is where this site is now.

Why? Hackers need something to hack.

But you're right, 'twas a bit much.


>There is almost literally documented examples of Facebook executives twirling their mustaches wondering how they can get kids more addicted.

Then close their business. Age verification just makes their crimes even more annoying.


Yes, please close it!

Ah, oh, decision makers are shareholders themselves and are benefiting from this too.


> I just like, can't help but start with fuck these companies. All other arguments are downstream of that. Better the nanny state than Nanny Zuck.

How about we reject all institutional nannies?

It is much easier to implement user-controlled on-device settings than any sort of over-the-Internet verification scheme. Parents purchase their children's devices and can adjust those settings before giving it to their kids. This is the crux of the problem, and all other arguments are downstream of this.


My friend's kids have access to his home servers. They don't get to roam on the internet. It's shocking to think parents might structure their children's lives.

> I know this is weird, but I'm in some ways not really sure who is on the side of freedom here. I get your position, but like.

No one. You'll see a few politicians and more individuals stick to their principles, but anyone with major clout sees the writing on the wall and is simply working to entrench their power.

> Better the nanny state than Nanny Zuck.

Indeed. What lolberts usually fail to understand is that it's not a choice between government and "freedom"; it's a choice between the current government and whoever will fill the power vacuum left by the government.


Don't conflate the Internet with Social Media. Social media is a service, just like FTP. The death of social media will not mean the death of the Internet. There's an argument that reducing social media use, by age verification or other means, will lead to a more free Internet due to reduced power of gatekeepers.

The problem is that the internet is nowadays used for democratic purposes. Once you introduce a globally unique personal ID, you will be monitored. And boy, you will be monitored thoroughly. If any democratic process ever needs to be undertaken against a government in the future, that very government will take the tools of identification and knock on the doors of the people trying to raise awareness, or maybe mutiny. And this is what Orwell wrote about.

Centralization and standardization are going to be the topic in the 21st century.

For all the complaining some U.S.-Americans seem to do about the EU approach to these issues, things like the Digital Markets Act aim to fix exactly these types of issues.

Heh, thank you.

I appreciate GPs point about giving “parents strong monitoring and restriction tools and empower them to protect their children”. That’s good. That acknowledges that we can and should give parents tools to deal with their kids and not let them fend for themselves (one nuclear family all alone) against the various algorithms, child group pressure, and so on.

But on the whole I’m tired of the road to serfdom framing on anything that regulates corporations.

Yes. Let’s be idealistic for a minute; the Internet was “supposed to” liberate us. Now we have to play Defense every damn day. And the best we have to offer is a false choice between nanny state and tech baron vulturism?

For a second just imagine. An Internet that empowers more than it enslaves. That makes us more equal. It’s difficult but you can try.


> documented examples of Facebook executives twirling their mustaches wondering how they can get kids more addicted

If you genuinely believe that this is about those moustache twirling executives, then I have a bridge to sell you.

Have you ever wondered why and how these systems are being implemented? Have you ever asked why Discord / Twitch / what have you, and why now? Have you ever thought that this might be happening because of Nepal and the fears of another Arab Spring?

https://www.aljazeera.com/news/2025/9/15/more-egalitarian-ho...

I think too many people on this platform don't understand what this is about. This is about power. It's not about what's good for you or the children. Or for the constituents. It's about power. Real power. Karp-ian "scare enemies and on occasion kill them" power.

There are many ways in which such a system could be implemented. They could have asked people to use a credit card. Adult entertainment services have been using this as a way to do tacit age verification for a very long time now. Or, they could have made a new zero-knowledge proof system. Or, ideally, they could have told the authorities to get bent. †

Tech is hardly the first industry to face significant (justifiable or unjustifiable) government backlash. I am hesitant to use them as examples as they're a net harm, whereas this is about preventing a societal net harm, but the fossil fuel and tobacco industries fought their governments for decades and straight up changed the political system to suit them. ††

FAANG are richer than they ever were. Even Discord can raise more and deploy more capital than most of the tobacco industry at the time. It's also a righteous cause. A cause most people can get behind (see: privacy as a selling point for Apple and the backlash to Ring). But they're not fighting this. They're leaning into it.

Let's take a look for a second at what Discord asked of people, the face scan:

    If you choose Facial Age Estimation, you’ll be prompted to record a short video selfie of your face. The Facial Age Estimation technology runs entirely on your device in real time when you are performing the verification. That means that facial scans never leave your device, and Discord and vendors never receive it. We only get your age group.
Their specific ask is to try and get depth data by moving the phone back and forth. This is not just "take a selfie": they're getting the user to move the device laterally to extract facial structure. The "face scan" (how is that defined??) never leaves the device, but that doesn't mean the biometric data isn't extracted and sent to their third-party supplier, k-ID.

There was an article that went viral for spoofing this, https://age-verifier.kibty.town/ // https://news.ycombinator.com/item?id=46982421 . In the article, the author found by examining the API response the system was sending,

    k-id, the age verification provider discord uses doesn't store or send your face to the server. instead, it sends a bunch of metadata about your face and general process details.
The author assumes that "this [approach] is good for your privacy." It's not. If you give me the depth data for a face, you've given me the fingerprint for that face.

We're anthropomorphising machines. A machine doesn't need pictures; "a bunch of metadata" will do just fine.
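To make that concrete: even a short vector of numeric measurements is enough to re-identify a face across sessions. The vectors below are made up, and real systems use far higher-dimensional embeddings, but the mechanism is nothing more than a similarity comparison.

```python
import math

# Illustrative only: a handful of facial measurements acts like a
# fingerprint. If two scans produce nearly identical vectors, they are
# almost certainly the same face. No image required.


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


scan_monday = [0.41, 0.73, 0.22, 0.58]  # hypothetical depth features
scan_friday = [0.40, 0.74, 0.21, 0.59]  # same person, new session
other_user = [0.91, 0.12, 0.66, 0.30]

same = cosine_similarity(scan_monday, scan_friday)  # very close to 1.0
diff = cosine_similarity(scan_monday, other_user)   # noticeably lower
```

So "we only send metadata, not your face" is a distinction without much of a privacy difference.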

We are assuming that the surveillance state will require humans sitting in a shadowy room going over pictures and videos. It won't. You can just use a bunch of vectors and a large multimodal model instead. Servers are cheap and never need to eat or sleep.

Certain firms are already doing this for the US Gov, https://x.com/vxunderground/status/2024188446214963351 / https://xcancel.com/vxunderground/status/2024188446214963351

We can assume de facto that Discord is also doing profiling along vectors (presumably behavioral and demographic features) which that author described as,

    after some trial and error, we narrowed the checked part to the prediction arrays, which are outputs, primaryOutputs and raws.

    turns out, both outputs and primaryOutputs are generated from raws. basically, the raw numbers are mapped to age outputs, and then the outliers get removed with z-score (once for primaryOutputs and twice for outputs).
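The z-score filtering the author describes is a standard trick: score each raw estimate by its distance from the mean in standard deviations, drop the outliers, and average what remains. A rough sketch (the threshold and the numbers are illustrative; the exact statistic k-ID uses isn't public):

```python
from statistics import mean, pstdev

# Sketch of z-score outlier removal as described in the quoted analysis.
# Threshold and sample data are made up for illustration.


def zscore_filter(raws: list[float], threshold: float = 1.5) -> list[float]:
    """Drop values more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(raws), pstdev(raws)
    if sigma == 0:
        return list(raws)
    return [r for r in raws if abs(r - mu) / sigma <= threshold]


# Per-frame raw age estimates, with one spurious frame:
raws = [24.1, 25.0, 23.9, 24.6, 61.0]
primary_outputs = zscore_filter(raws)       # filtered once
outputs = zscore_filter(primary_outputs)    # filtered twice, per the article
```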
Discord plugs into games and allows people to share what they're doing with their friends. For example, Discord can automatically share with a user's friends which song they're listening to on Spotify (friends can join in), the game they're playing, whether they're streaming on Twitch, etc.

In general, Discord seems to have fairly reliable data about the other applications the user is running. Discord also has data about your voice and now your face.

Is some or all of this data being turned into features that are being fed to this third-party k-ID? https://www.k-id.com/

https://www.forbes.com/sites/mattgardner1/2024/06/25/k-id-cl...

https://www.techinasia.com/a16z-lightspeed-bet-singapore-par...

k-ID is (at first glance) extracting fairly similar data from Snapchat, Twitch etc. With ID documents added into the mix, this certainly seems like a very interesting global profiling dataset backstopped with government documentation as ground truth.

I'm sure that's totally unrelated. :)

-

† like they already have for algorithmic social media and profiling, https://www.newyorker.com/magazine/2024/10/14/silicon-valley...

Somehow there's tens to hundreds of millions available for crypto causes and algorithmic social media crusades, but there's none for the "existential threat" of age verification.

†† Once again, this is old hat. See also: Turbotax, https://www.propublica.org/article/inside-turbotax-20-year-f...


yep, the ol' four horsemen of internet censorship lol

if folks actually wanted to protect minors they would age-restrict internet ACCESS instead of letting adults' personal details get spewed all over the world for bad actors to take advantage of.


> I know this is weird, but I'm in some ways not really sure who is on the side of freedom here.

That’s because “freedom” is complicated and doesn’t precisely map to the interests of any of the major actors. It’s largely a war between parties seeking control for different elites, for different purposes.


Yes, seeking more control for themselves and completely at the expense of everybody else's loss.

You sound like someone that would work for the CIA or FBI if they offered you a job. Those are the types of people that I cannot and will not ever trust. I do not respect your opinion.

You don't have to, but hilariously, I would never work for the CIA or FBI (I mean I can't, I think they require a college degree). But the most paranoid conspiracy-theorist libertarian hacker I ever knew did, and said it was the best work of his life. Ironic?

I'm going to move off-grid and become a sovereign citizen.


I don't get your point, at least not in relation to the GP post. I agree with GP: parents need to be more accountable. We as parents (and we should all be concerned about future children/generations) should be demanding more regulation to help force the change we need on this topic. We as a society need to treat SM like those other addictive product classes. The fact that SM is addicting and that execs try to juice it more is, frankly, to be expected.

Vilify them all you want, but the same has been done with nicotine products, alcohol products, etc., and to GP's point, we treat SM as a toy for our children to play with. We chose to change the rules (laws, regulations, etc.) because capitalists can never simply be trusted to do what's best for anything except their bottom line. In a capitalistic society, that's a fundamental law no different than inertia or gravity. That's why regulators exist. Until you regulate it, they will wear their villain badge and rake in the billions. It's easy to be disliked when the topic of your disdain is what makes you filthy rich (in other words, they don't care what you or I think of what they're doing).


not social media: treat the entire internet as fundamentally hazardous to kids, because it is, just like cigarettes, alcohol, and porn. Check IDs once when signing the contract required for internet access, and all these problems go away.

That’s fair. I do think there's some point where it gets too broad, though. The definition part will be a tough balancing act.

As an example, I’ve been a fairly strict parent with devices and content access. But, I do let my son play Switch games with his friends which requires the internet. I feel it’s ok in moderation, he plays no more than about 3 hours a week.


>I get your position ... There is almost literally documented examples of Facebook executives twirling their mustaches wondering how they can get kids more addicted. This isn't a few bands

Their position was to compare it to alcohol, guns, and tobacco, not bands using naughty words. Alcohol and tobacco definitely enter mustache-twirling territory: getting children addicted and funding misinformation on the harms of their product.


> There is almost literally documented examples of…

lol


> Better the nanny state than Nanny Zuck.

The state can imprison you. Zuck can't.


Yet! ;-)

Nobody is putting a gun to your head and forcing you to use Facebook or whatever other site. I quit using most social media over a decade ago. If you don't want to use it, or you don't want your children to use it, then don't use it.

Oh yeah? Where's that guy who couldn't get a job six months ago because he refuses to use LinkedIn?

I am a bit confused by that comment. Are parents socially responsible for preventing companies from selling alcohol/guns/cigarettes to minors? If a company set up shop in a school and sold those things to minors during school breaks, who has the social responsibility to stop that?

when I was a kid in the early 90's, my state (and many others) banned cigarette vending machines since there was no way to prevent them being used by minors, unless they were inside a bar, where minors were already not allowed.

The problem is, doing the analogous action with the entire internet is a privacy nightmare. You didn't have to tell 7-11 every item you bought at every store in the past 2 years and opt-in to telling them what other stores you go to for the next 5.

There is no digital equivalent of "flash an ID card and be done with it" in the surveillance state era of the internet. Using a CC is the closest we have and even then you're giving data away.


The analogous action is to only require age-restricted sites (or parts of sites) to check ID, not the entire Internet. e.g. no one is calling for mathisfun.com to check ID. I'd expect most parts of the web are child-friendly and would not be affected. Just like how almost all locations in physical space don't need to check ID.

Additionally, the laws I've read mandate that no data be retained, so you have stronger legal protections than typical credit card use, or even giving your ID to a store clerk for age restricted purchases (many stores will scan it without asking, and in some states scanning is required).


This might have the benefit of reversing the trend where everything on the internet was rolled in to social media. If social media is age restricted, news, announcements, etc will have to break out to dedicated websites if they want to be accessible by all ages.

just ban kids from the internet already. if a parent allows the kid to have a full-function smartphone and the kid gets caught with it, then throw the parents in jail and the kids in an orphanage. people will catch on.

Anything that can take user input, including any forum, would get popped.

I don't see why that would be the case. It's reasonable to allow services that have a policy forbidding such content and make good faith efforts to moderate and remove it promptly. Seems analogous to e.g. a building being vandalized with lewd drawings. Or laws about user submitted child pornography.

I expect most forums or discussion groups in practice actually don't have child-inappropriate content, and already moderate such things because the members don't want it.


You do not need to control the entire internet. Put time limits on connected devices. Use parental controls. Talk to your kids about what they do online. Set clear boundaries. Reward good behaviour. Talk to other parents to align these limits to avoid social issues among the kids.

We may be agreeing. I'm saying there is no battle-tested, privacy-safe technical method of verifying age online, and thus the controls need to be in the physical environment and in setting social standards for social media and phone use.

I think the argument is more that it should be illegal so as not to be forced into playing "the bad guy". It's hard to prevent a level of entitlement and resentment if less-well-parented peers have full access. If nobody is allowed, then there's no parental friction at all.

It's unfortunate that the application of this rule is being performed at the software level via ad-hoc age verification, as opposed to the device level (e.g. smartphones themselves). However, that might require the rigmarole of the state forcibly confiscating smartphones from minors, or worrying Nepalese outcomes.


I'm saying hold parents accountable for their children's online behavior and for their protection online, not companies (who want to profit off the kids, a perverse incentive) or governments (who can barely be trusted to do this even if it were their only goal). For example, if your kid starts making revenge CP of their classmates, and the parent could reasonably have known about or mitigated it, I think the parent absolutely should be held responsible.

Don't punish the rest of the web for crappy parenting and crappy incentives by companies/govts.


If we want parents to be accountable, then these platforms need to provide better tools to enable parents to do so. It is impossible to monitor the entirety of your child's behavior online through any of these platforms today. They are their own person, they make their own choices, and those choices are heavily influenced by a world the parents have increasingly less influence over, especially as they grow older.

On the flip side, I do think we should also hold companies more accountable for this. We collectively prevented companies from advertising tobacco to minors through regulation with a pretty massive success rate. These companies know how harmful social media can be on youth, and there is little to no effective regulation around how children learn about these platforms and get enticed into them.


I do not disagree with any of this, I was hoping it was implied by my original comment that this would be necessary.

This all needs to be modulated by the knowledge that some children benefit immensely from being able to hide parts of their lives from their parents, parts that their parents would disagree with greatly.

The clearest example is LGBTQ kids who want to talk to other LGBTQ kids, or enjoy LGBTQ content, without fundamentalist or just homophobic/transphobic parents finding out. Children of fundamentalist or cult members who want an escape from the cult are another common category.


I’m embarrassed to admit this hadn’t even occurred to me until I read your comment.

> I'm saying hold parents accountable for their children's online behavior and for their protection online

You're describing the status quo, and I think it's fair to say you wouldn't intentionally design the status quo. Unless we have some wizard wheeze where we can easily arrest, detain, or otherwise effectively punish parents without further reducing the quality of life of their children.


But it's not playing the bad guy. It's playing the good guy.

In the abstract, but in the social context of the home you have to be the bad guy. While good parents manage that, the bar is too high for society in general.

The bar isn't that high at all. It's just what norms you decide to set. You could make this argument for any particular parenting decision, from washing hands before food to saying no to the next desired purchase. It doesn't make sense to special-case this. At some point you're setting rules, and it's not that difficult. Just don't buy the device.

Sorry, but I feel like your standards are a bit higher than the average here. The bar is, and has to be, extremely low to hope for compliance.

This isn't a compliance based exercise. You can achieve compliance by setting no rules.

From the perspective of the kids you are the bad guy.

Parents can't easily prevent their kids from going to those kinds of stores once they're at the age where the parent doesn't need to keep an eye on them all the time and they can travel about on their own.

The difference though is that parents are generally the ones to give their kids their phones and devices. These devices could send headers to websites saying "I'm a kid" -- but this system doesn't exist, and parents apparently don't use existing parental controls properly or at all.


> These devices could send headers to websites saying "I'm a kid" -- but this system doesn't exist

And there would be ways to work around it. If people find that privacy-preserving age verification is not good enough because "some kids will work around it", then nothing is good enough, period. Some will always work around anything.


if a parent gives a kid a full on smartphone, charge the parent with child abuse just like feeding the kid alcohol, cigarettes or having sex with them. people will catch on.

Or people who aren't parents are yet again sharing strong opinions that are not based in reality. Plenty of parental controls are deployed; how long they last against a determined child is the real question. Here's a concrete example for you: Spotify has a built-in web browser so that you can watch music videos, and kids have figured out a way to use it to watch any video on YouTube (a 12-year-old told me this). If you search on this subject you'll quickly learn this is well known and is generally being ignored by Spotify. Why not allow parents to disable the in-app web browser / video function?

It's not as easy as you may believe to prevent that type of access.


So what’s the alternative? Pretend we don’t live in a digitally connected society and set our kids up for failure when they get one years after their peers?

Let's assume for the sake of argument that social media is extremely harmful to children. Which means the answer to your question is "yes, obviously". If people were running around giving their kids fentanyl, you wouldn't say "but my kid's friends all use fentanyl and he'll be an outcast if he can't". You would say "any friends that he loses over this are well worth avoiding the damage". Why would it be different just because it's social media?

Phones, I mean. Sorry for the confusion there. I’m for holding off on social media.

Keeping your kids off social media is setting them up for success.

I’m talking about phones specifically. Agree re: social media.

The problem isn't with phones. We should have robust parental controls and the responsibility of parenting should be left to, wait for it... the parents.

The person before me is the one who brought up phones.

> The difference though is that parents are generally the ones to give their kids their phones and devices.

But either way I disagree. This comment sums up my point: https://news.ycombinator.com/item?id=47122715#47128105


ISPs and OSes should be the ones providing these tools, and should make it stupid easy to set up a child's account and have a walled garden for kids to use.

I live in the UK. By default your ISP will block "mature" content and you have to contact them to opt out. iOS, Android, Playstation, Xbox, Switch all have parental controls that are enforced at an account level.

A child with an iPhone, Xbox, and a Windows Laptop won't be able to install discord unless the parent explicitly lets them, or opts out of all the parental controls those platforms have to offer.

The tech is here already, this is not about keeping children safe.


You have to be very tech savvy to know that your kid asking to install Discord to talk to/play games with their friend group is as dangerous as it is.

A single google search will tell you pretty unanimously that discord isn’t for kids, is rated 13+ and has risks of talking to strangers.

Parts of discord are not safe at all for 13 year olds and currently there isn't a mechanism as far as I am aware to restrict a 13 year old from accessing them.

The solution to that is obviously some sort of Parental features, where a parent can create accounts for their kids with restricted access and/or monitoring capabilities. The solution isn't to require an ID from everyone just to "protect the kids"...

No, it's about corporate and government control. Thankfully, the UK government is clueless about tech, which means these controls can be bypassed relatively easily by using your own DNS or a public DNS server like Quad9.

The corporations in this case are fighting against this. This is about your government and its desire to squash opinions they don't like. They are already going so far as to jail people for posting opinions they don't like. This has absolutely nothing to do with children, children are just the excuse.

There's a law going through in some state that wants to do this, but also puts the onus on the OS developers to detect age-aligned behavior. How do you do this with Linux? It would kill the open computer and kill ownership over computing.

Why would it be a problem to do this sort of thing with Linux? Linux allows for OAuth, proxied networking, what have you. Unless they're using some super-secret unpublished protocol, Linux will be fine.

I'm against these age-verification laws, but to say it's impossible to comply with open-source software isn't really true.


The point is that you won't be able to just install a Linux distro of your choice in this world - your computer will only run approved OSs that have gone through some kind of certification process to make sure they enforce age-verification content. If, say, the Debian foundation doesn't want to add these mandatory controls because they feel it goes against the spirit of Debian (not to mention the huge issues with the GPL), then your new computer just won't be able to run Debian anymore. And something like Kali would be right out, of course, since anonymity is not compatible with age verification.

Or, conversely, these systems won't be able to verify age and will just be shut out of adult content. Which is fine: just keep a Windows machine around for porn and do your actual work on a real computer.

Mark Zuckerberg advocates for this, most people entrenched in this argument think it's worse. But I'm all for burning it to the ground so.

You must not have kids if you think it's easy to keep children off things that are bad for them.

[Any] task is much easier if you have the tools. Do/did you have a baby monitor? A technological tool that allows you to "monitor" the baby while not being within arm's reach.

Do you have an A+++++ oven with three panes of glass? It's [relatively] safe to touch, and instead of monitoring whether a child is somewhere near the oven you only have to monitor whether the child actively opens the oven. That's much easier.


Dumb question, not a parent — how old does a child have to be before they'll only touch the hot thing once so you don't need to guard it?

They learn to not intentionally touch the hot thing between 1 to 2 years, but then they still can fall on it or hit it during play.

As a parent you quickly learn that when you don't actively prevent major accidents it ends up costing you much more time, stress, screaming, etc.


It's really not some Herculean task to do so either, though.

I remember how my sister and I set up Google Family and fully locked down my niece's phone with app restrictions, screen time restrictions, and a policy of accountability when we need to extend the screen time.

It worked really well up until she got a school managed chromebook for homework with no access controls.


Can't your router block by MAC address? Just limit the Chromebook to allowlisted sites. Also, school-issued computers are known for spyware and even worse; they should probably be segregated onto a separate network or VLAN.
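As a minimal sketch of that router-side approach, assuming a Linux-based router with iptables and ipset available (the MAC address and set name below are placeholders). MAC filtering is easily spoofed, so treat this as a convenience, not a security boundary:

```shell
# Create an allowlist of destination IPs (populated however you like,
# e.g. from the resolved addresses of homework sites).
ipset create homework_sites hash:ip

# Let the Chromebook (placeholder MAC) reach only allowlisted IPs...
iptables -I FORWARD -m mac --mac-source AA:BB:CC:DD:EE:FF \
  -m set --match-set homework_sites dst -j ACCEPT

# ...and drop everything else it tries to send through the router.
iptables -A FORWARD -m mac --mac-source AA:BB:CC:DD:EE:FF -j DROP
```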

Maybe you don't have kids of your own. Once you have 2 or 3, it is quite challenging to manage everything, especially over time.

Especially if they are older, like 8+ years old. They are resourceful, sneaky and relentless.

Which is exactly why all people everywhere giving up their privacy will also be ineffective.

Drugs, alcohol, cigarettes, pornography were all illegal for me to access as a kid but I wouldn’t have had any trouble getting any of it.


Maybe at 16, not at 8.

Many of my school colleagues started smoking around 10-11 years old. All of us had tasted alcohol by then, and some were definitely drinking the occasional beer. Older kids sometimes brought porn magazines to school and would show the younger kids too (still talking about pre-high school here). Now, this was childhood in Romania in the 1990s and early 2000s, soon after the fall of communism, so maybe not so applicable everywhere else, but still - I doubt a resourceful 8-10 year old would have any problem getting some of these things even today.

There’s a difference between “saw a playboy once” and having regular or semi regular access to it.

Same goes for alcohol and cigarettes.

In the US, if you had regular access to those things, you had parents who didn’t care.

It’s also not about kids on the margin. The vast majority of 8 year olds in the US have not tried alcohol, drugs, or cigarettes.

I can’t really speak to post Cold War Romania.


The older kids are often the easy source for the younger kids. At 8 I had already seen a Playboy and knew kids who had seen harder stuff. I could have easily gotten a teenager to get me cigarettes (and drugs, but I didn’t know what those were really). I had also already tasted alcohol. Any of this I could have stolen from any number of places.

At 16 it was easier, but at 8 it wasn’t hard.


There’s a difference between “saw a playboy once” and having regular or semi regular access to it.

Same goes for alcohol and cigarettes.

If you had regular access to those things you had parents who didn’t care.

It’s also not about kids on the margin. The vast majority of 8 year olds have not tried alcohol, drugs, or cigarettes.


There’s also a difference between “saw my first” and “saw a playboy once.” I need you to understand I was a good kid whose parents cared until they divorced some years later. And yet I had multiple sources of access to this stuff without looking for it. Now, as an adult, I can see more ways I could have gotten it if I wanted it.

Again, if you occasionally caught a glimpse of a playboy, that’s not a significant problem.

If you were regularly smoking cigarettes, drinking alcohol, and reading porn magazines at 8 years old, your parents fell down on the job. An 8 year old doesn’t have the wherewithal to hide that from parents who are paying attention.

> Now, as an adult, I can see more ways I could have gotten it if I wanted it.

Yeah a kid with the mind of an adult could access all kinds of illegal material.

Making it illegal to rob a bank doesn’t mean that’s it’s literally impossible. It’s about stopping enough people from trying that society functions.

The state of the world before the internet was that it was hard to keep a kid from ever glimpsing a titty, but it was relatively easy to keep a kid from having regular access to hard core porn-much, much easier than it is now. My take is that as a society we need to figure out some way to make this easy enough for parents to do that it becomes the default. Just like drugs, alcohol, and porno mags.

Another issue is that online porn and algorithmic brain rot is free (at least enough of it is). With IRL contraband, lack of money is a big limiting factor for kids. The IRL equivalent would be if the local library let 8 year olds check out hardcore porn DVDs.


Yeah. Anyway, porn, cigarettes, alcohol, and drugs were very accessible to me despite being a good kid with parents who cared in a world where those were all legally forbidden to me.

All this talk of “glimpses” is you trying to read too deep into a single example.

I’m not using my adult mind to figure out how I could have gotten this stuff as a kid. I’m using my adult mind to recognize that if I had been motivated as a kid, there are additional ways I, as a kid, would have been able to figure out how to get it.

I’m not throwing my hands up in the air and saying this is impossible or that we should just open up access. I’m saying requiring ID for access wasn’t effective before and it won’t be effective in a world with easier access. Yet the cost of that is quite high. Scan these threads for actual ideas, I’m not arguing for any particular one but there are plenty of them and some I think are good.


>Yeah. Anyway, porn, cigarettes, alcohol, and drugs were very accessible to me despite being a good kid with parents who cared in a world where those were all legally forbidden to me.

Were they accessible to you, or do you just think they were accessible to you? How many of these teenagers who would let you try a cigarette would have been willing to keep supplying you cigarettes regularly. How many would have been willing to keep buying you alcohol?

>All this talk of “glimpses” is you trying to read too deep into a single example.

No, it's glimpses, because it's about at the very least semi-regular access, not preventing every single child from having tiny amounts of alcohol. Look at my reply to the other poster in this thread. There are dozens of studies showing conclusively that minimum drinking age laws reduce alcohol use among children and reduce alcoholism later in life.

>I’m saying requiring ID for access wasn’t effective before

But yes it was effective. Read the studies. Minimum age drinking laws have been shown almost universally to be effective. Not at stopping every child from drinking but at harm reduction.

>I’m using my adult mind to recognize that if I had been motivated as a kid, there are additional ways I. as a kid, would have been able to figure out how to get it.

The level of effort an 8 year old would have to go through to get regular access to cigarettes and alcohol in the US would require an enormous level of motivation which almost no 8 year old has, and it would be outright impossible to do without a semi-observant parent noticing.

That's the whole point of making it hard to do.

It takes much less effort for a kid to walk to the library and check out a hardcore porn DVD than it does for him to convince an 18 year old to buy one for him. Most kids just aren't going to go through the hassle of doing the latter, but they'd do the former in a heartbeat. All things being equal, greater motivation is required to overcome greater obstacles.


You are writing this as if you were never a kid yourself... There is absolutely nothing I wasn't able to "get" as a kid - some stuff I had to jump through some hoops for, but the end result would always be the same. If I wanted to watch hardcore porn, there was a way; if I wanted to smoke a cigarette, there was a way; if I wanted to drink, there was a way. And making it "forbidden" made it ever more appealing for me to get as a kid. I grew up in a society where alcohol was not a big deal. I was buying alcohol for my parents when I was 6 years old - I would get sent to the store, and among the items was always beer, and sometimes wine if my parents were expecting guests. Most of my friends growing up never thought of alcohol as something cool; we had easy access to it, so it was never a rite of passage or anything like that, and it showed - just about no one was doing any drinking while we were teenagers. When I came to America junior year of high school I was stunned at how much effort my schoolmates were making to acquire alcohol - I could not really understand what the big deal was until I realized it was because it was forbidden, and acquiring beer etc. for a Friday evening chill made one a cool kid.

The only barrier I ever had to doing stupid things was the wrath of my parents. The punishments levied when I did stupid shit were always such that I would seldom-to-never consider doing it again. It always starts and ends with parents. You can put in whatever "laws" you want (which will always get weaponized politically at some point, either immediately or later), but at the end of the day the buck starts and stops with parents...


1. There is no scientific evidence that the "forbidden fruit" theory is correct. Studies of minimum drinking ages show a near universal reduction in drunk driving deaths, alcoholism, and crime rates.

https://pmc.ncbi.nlm.nih.gov/articles/PMC3018854/ https://pmc.ncbi.nlm.nih.gov/articles/PMC3586293/ https://pmc.ncbi.nlm.nih.gov/articles/PMC4961607/ https://www.nytimes.com/roomfordebate/2015/02/10/you-must-be... https://www.cdc.gov/alcohol/underage-drinking/minimum-legal-...

If you care to google it there are dozens of additional studies that all say the same thing.

2. You're writing this as if you don't understand what it's like growing up in a country where 8 year olds don't have easy access to alcohol, cigarettes, and drugs.

And you're writing this as if you don't understand what it's like growing up as a kid in America specifically. My young children and the young children of everyone I know could not regularly drink alcohol or smoke cigarettes without their parents knowing about it. When I was 8 I couldn't have done either regularly without my parents knowing about it.

Again, this isn't about stopping every single kid in the world from ever trying alcohol. This is about making it harder for them to get and easier for parents to enforce.

>end of the day the buck starts and stops with parents...

That's a completely unrealistic view of the world, and it's just flat out wrong on its face, because every study we have on the subject shows that minimum drinking age laws reduce harm--they work. If it were solely up to the parents, they wouldn't work.

The easier you make it for parents to do the right thing, the more of them will do it.


over 10 years ago, I had an intern from Harvard CS tell me that privacy is irrelevant unless you're doing something that you want to hide. I was gobsmacked that someone would not cherish their privacy but since then I've realized many don't care at all and have the same attitude that "I don't have anything to hide."

Well that's your mistake right there. You hired someone from Harvard. Unless you are hiring that person to use their connections to market your product, there is no reason to hire someone from Harvard. They just bring bad ideology and STDs from Russian hookers to the table and nobody wants that.

PS This post is partly satire, I will leave it to you as to which part is serious.


> They are resourceful, sneaky and relentless.

... and honest:

- they will honestly tell you that they'd be very happy to see you dead when you impose restrictions upon them (people who are older will of course possibly get into legal trouble for such a statement)

- they will tell they they wish you'd never have given birth to them (or aborted them)

- they will tell you that since they never wanted to be born, they owe you nothing

- ...


Sounds like a kid in need of psychiatric help.

You barely ever had to deal with pubescent children? :-)

I raised kids. Never had to deal with anything like what is described. Sounds like someone read some questionable books on parenting, unfortunately followed the bad advice in those books and this is the result.

And this entire thing is about bad parenting. It's always easier to just give the kid a tablet and go back to whatever you were doing. It's always better to actually interact with the kid. That trade-off of time is important, because if you mess up when they are young, you spend a lot more time handling issues later on. The time you gained by giving them a tablet will get paid back someday, usually with interest. That's what is happening here.


Please get the kids some help before we have to send you thoughts and prayers

I mean, that's really not normal puberty stuff, but... okay.

As a father of 3, one thing the wife and I had to learn over the course of the first two is that the modern world holds parents to impossible standards and a "fuck off" attitude is required for much of it.

We've had pediatricians shame us for feeding our kids what they're willing to eat and not magically forcing "a more varied diet" down their throats at every meal, despite them being perfectly healthy by every objective metric. There are laws making it technically illegal for us to leave our kids unsupervised at home for any period of time in any condition, even a few minutes if one of us is running slightly late from work/appointments.

Your not-quite-2-year-old is too tall for a rear-facing car-seat? You're a bad parent, possibly a criminal and putting them at risk by flipping the seat to face forward, a responsible parent spends hundreds of dollars they don't have on several different seats to maybe find one that fits better or have their kid ride uncomfortably and arguably unsafely with their legs hyper-extended up the seatback.

Miss a flu shot because you were busy? Careful you don't come off as an antivaxxer.

And all of this and more on top of changing diapers, doctors' appointments, daycare, preschool, school, family activities and full time jobs?

Yeah, when my kids are old enough to engage with social media I will teach them how to use it responsibly, warn them about the dangers, make myself available to them if they have any problems, enforce putting the phones down at dinner and keep a loose eye on their usage. Fortunately/unfortunately for them they have a technically sophisticated father who knows how to log web activity on the family router without their knowledge. So if anything goes sideways I'll have some hard information to look at. Most families don't have that level of technical skill.


I was almost certainly never going to be a parent for other unrelated reasons, but you have just given me a whole other list of confirmations for that decision that I hadn't thought of before.

Thank you for that.


Well it's all more than worth it, at least for us. But that doesn't make some of the excess judgement tedious to deal with.

Kids are great at forcing you to prioritize. All of a sudden pre-ground coffee is worth it.


The school, in loco parentis.

Not responsible for selling to all minors, just theirs.

Well the parents entrust their kids to the school, so they would be the ones responsible for what goes on on their premises. In turn, school computers are famously locked down to the point of being absolutely useless.

That's really a district-by-district / school-by-school thing, some are significantly more locked down than others

Companies are legally prohibited from marketing and selling certain products like tobacco and alcohol because they historically tried to.

Parents are legally and socially expected to keep their kids away from tobacco and alcohol. You're breaking legal and social convention if you allow your kids to access dangerous drugs.

Capitalist social media is exactly as dangerous as alcohol and tobacco. Somebody should be held responsible for that, and the legal and social framework we already have for dealing with people who want to get kids addicted to shit works fairly well.


So we should ban social media is what you're saying but not what OC is saying.

Banning access to social media for kids under 18 similar to how tobacco and alcohol is banned to underage people would be the more direct line.

This argument is quite close to what governments are "trying" to do here! And I think you'll find very few people amenable to the idea that we should allow cigarettes to be sold to underage people (even if in practice they still get access).

The argument on the "don't do the social media ban" side is quite an uphill battle if you dig into this metaphor too much


All "social media" that uses recommendation algorithms should be unavailable to children.

At least give it a try.

"Capitalist social media is exactly as dangerous as alcohol and tobacco. Somebody should be held responsible for that, and the legal and social framework we already have for dealing with people who want to get kids addicted to shit works fairly well."

They work hand in hand with governments around the world, that's why they get the tax breaks. In return they hand over details about your opinions, social networks and whereabouts, not to mention facial recognition data via Facebook. They aren't remotely capitalist in any real sense since they have a bad business model.


> Capitalist social media is exactly as dangerous as alcohol and tobacco.

Most actual studies done on this topic find very little evidence this is true.

It's a run-of-the-mill moral panic. People breathlessly repeating memes about whatever "kids these days" are up to and how horrible it is, as adults have done for thousands of years.

I expect some emotional attacks in response for questioning the big panic of the day, but before you do so please explore:

[1] Effects of reducing social media use are small and inconsistent: https://www.sciencedirect.com/science/article/pii/S266656032...

[2] Belief in "Social media addiction" is wholly explained by media framing and not an actual addiction: https://www.nature.com/articles/s41598-025-27053-2

[3] No causal link between time spent on social media and mental health harm: https://www.theguardian.com/media/2026/jan/14/social-media-t...

[4] The Flawed Evidence Behind Jonathan Haidt's Panic Farming: https://reason.com/2023/03/29/the-statistically-flawed-evide...


> give parents strong monitoring and restriction tools

The problem is that it's bloody hard to actually do this. I'm in a war with my 7yo about YouTube; the terms of engagement are, I can block it however I want from the network side, and if he can get around it, he can watch.

Well, after many successful months of DNS block, he discovered proxies. After blocking enough of those to dissuade him, he discovered Firefox DNS-over-HTTPS, making it basically impossible to block him without blocking every Cloudflare IP or something. Would love to be wrong about that, but it seems like even just blocking a site is basically impossible without putting nanny-ware right on his machine; and that's only a bootable Linux USB stick away from being removed unless I lock down the BIOS and all that, and at that point it's not his computer and the rules of engagement have been voided.

For now I'm just using "policy" to stop him, but IMO the tools that parents have are weak unless you just want your kid to be an iPad user and never learn how a computer works at all.
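For what it's worth, the DoH loophole has a partial countermeasure: in its default configuration, Firefox probes the canary domain use-application-dns.net and disables its built-in DNS-over-HTTPS if the local resolver returns NXDOMAIN for it. This does not help once DoH has been explicitly enabled in settings, so it's a speed bump rather than a fix. A dnsmasq sketch, where the blocked zones are illustrative:

```ini
# /etc/dnsmasq.conf (fragment)

# Return NXDOMAIN for Firefox's DoH canary domain, signalling
# "this network filters DNS; don't auto-enable DNS-over-HTTPS".
local=/use-application-dns.net/

# Return NXDOMAIN for the target zones themselves (illustrative).
local=/youtube.com/
local=/googlevideo.com/
```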


As a parent of young children, this is your entire problem:

> the terms of engagement are, I can block it however I want from the network side, and if he can get around it, he can watch.

You're treating this as a technical problem, not a parental rules problem. Your own rules say he's allowed to watch!

You have to set the expectations and enforce it as a parent.


Depends on what the goal is. But yeah I agree if you really don't want them on YouTube (or whatever) and really do want them to tinker with their devices then you're likely going to have to eschew technical measures for more overt ones.

I think the point is that it's not enforceable.

I remember when I was a kid that age there were rules, and some were technically enforced. But if you found a way around the technical enforcement you were in huge trouble. The equivalent here would have been: if you used a proxy to watch what you weren't meant to, then you lose all screen time indefinitely. Sneaking around parents' rules was absolutely not on.

Sounds like a smart kid, is part of you secretly proud of him for his tenacity?

Is it impractical to keep an eye on what he's doing on his computer, i.e. physically checking in on him from time to time?

How about holding him responsible for his own behavior, to develop respect for the rules you impose? Is it just hopeless, and if so how come? Is it impossible for him to understand why you don't want him watching certain content or why he should care about being worthy of your trust?

I'm not judging here, I'm genuinely curious.


Personally I wouldn't want to expose a child to "the algorithm" ie recommendations. It turns up useful stuff but (IMO) the stream contains an unacceptable concentration of radioactive waste and becomes increasingly concentrated if you click on any of it.

I might suggest explaining this to him, providing a uBlock filter to sanitize the page, and requiring use of said filter.
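As a sketch of that filter approach: uBlock Origin's "My filters" pane accepts cosmetic rules like the ones below. The element names are assumptions matching YouTube's markup at one point in time, and they break whenever the site's DOM changes.

```
! Home-page recommendation grid
youtube.com##ytd-rich-grid-renderer
! "Up next" sidebar on watch pages
youtube.com##ytd-watch-next-secondary-results-renderer
! Shorts shelves
youtube.com##ytd-reel-shelf-renderer
```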


The obvious solution would be TLS interception and protocol whitelisting. Same as corporate IT. Stick the kids' devices on a separate VLAN if you don't want to catch all the other devices in the crossfire.

Still, there's an awful lot of excellent educational content on YouTube. It seems unfortunate to block access to that. Have you considered self hosting an alternative frontend for it?


At this point why not just emancipate him. Hook him up with an easy remote job, put a lock on his bedroom and hand him the keys, and make him start paying rent. Because I’m having trouble figuring out what part of society you’re preparing him for at this stage. Respectfully.

Putting controls on the machine you want to restrict is pretty normal. While I agree with your first sentence that it's hard for parents to get proper monitoring tools, the rest of this sounds like a self-imposed problem. If you don't want to mess with the actual machine then run a proxy it has to use.

What's so hard about taking the iPad out of their hands? Or the laptop or whatever, once you catch them on sites they shouldn't be on?

You're understating the US's policy on recklessness. We have "attractive nuisances," which means that if you put a trampoline in your backyard, and a kid passing through sees it, decides to do a sick jump off of it, and breaks their leg, that was partly your fault for having something so awesome that kids would probably like.

> which means that if you put a trampoline in your backyard, and a kid passing through sees it, decides to do a sick jump off of it, and breaks their leg, that was partly your fault for having something so awesome that kids would probably like.

That's not exactly accurate. The two key parts of the attractive nuisance law are a failure to secure something combined with the victim being too young to understand the risks.

So if you put a trampoline in your front yard, that's an easy attractive nuisance case.

If you put a pool in your back yard with a fence and a locked gate, it would be much harder to argue that it was an attractive nuisance.

If a 17 year old kid comes along and breaks into your back yard by hopping a 6-foot tall fence, you'd also have a hard time arguing they didn't understand that their activities came with some risk. Most cases are about very young children, though there are exceptions.


>put a trampoline in your front yard

This is exactly what one of our neighbors did when I was growing up.

All the kids loved it.

There just weren't very many lawsuits back then like there are now after the number of attorneys proliferated so much.

To be as safe as they could, the parents put the trampoline in a pit where the bouncing surface was at ground level.

If you drove by, you wouldn't even be able to see it, or have any idea that it was there.

Unless there was somebody bouncing at the time.

You should have seen the look on peoples' faces when they drove down our street and saw that for the first time :)


It would not be quite that simple. The trampoline (or pool, or whatever) would have to be visible, in a place children were likely to be, not protected by any reasonable amount of care, and the kid would have to be young enough to not know any better.

The legal doctrine is also not specific to the US, of course.


And that law is incredibly and hideously stupid, as it's a heckler's veto on having cool stuff.

The Internet is basically the final frontier where this harmful law doesn't reach, though the Karens are really trying to expand their power there.


A monitoring solution might have worked for my case if my parents had monitored my Internet history, if they always made sure to check in on what I thought/felt from what I watched and made sure I felt secure in relying on them to back me up in the worst cases.

But I didn't have emotionally mature parents, and I'm sure so many children growing up now don't either. They're going to read arguments like these and say they're already doing enough. Maybe they truly believe they are, even if they're mistaken. Or maybe they won't read arguments like these at all. Parenting methods are diverse but smartphones are ubiquitous.

So yes, I agree that parents need to be held accountable, but I'm torn on if the legal avenue is feasible compared to the cultural one. Children also need more social support if they can't rely on their parents like in my case, or tech is going to eat them alive. Social solutions/public works are kind of boring compared to technology solutions, but society has been around longer than smartphones.


Should the state have forced your parents to give you up for adoption? That's the social support the state can offer.

This is the real point that needs to be made.

You can argue that many parents are less than ideal parents, but that is not sufficient to justify having the state step in. You also have to show that the state is less bad.

Decades of data on the foster system strongly suggests otherwise. The state, by any objective measure, is terrible at raising children.


I don't think it would have helped, given the outcomes for foster children are near universally worse except in the most extreme cases of abuse. I did threaten to call CPS but I was, of course, berated for it and threatened that I would be taken away, so that shut me up. Since I was never assaulted I doubt it would have reached the standard for foster care anyway, yet the consequences still endure to this day.

I was told over and over by in hindsight unqualified persons that emotional abuse wasn't real abuse, so after a few years I was disinclined to seek help.

If I had had even one person that supported me unconditionally instead of none at all, even if that person wasn't a parent, I'm fairly certain I would have turned out differently. That was just a matter of luck, and I came out empty-handed. I never felt comfortable talking about what I was exposed to online with anyone, and that only hurt me further, but I was a child and couldn't see another option.


So the only options are no support or give you up for adoption? No middle ground is possible?

As a parent, I think you’re understating how difficult it is to provide a specific amount of internet access (and no more) to a motivated kid. Kids research and trade parental control exploits, and schools issue devices with weak controls whether parents like it or not. I’m way at the extreme end of trying to control access (other than parents who don’t allow any device usage at all) and it has been one loophole after another.

As somebody who is entirely for restrictions on internet / social media, I think you're missing the bigger picture here. First, you assume that parents have the technical knowhow to restrict their kids from specific sites. My parents used a lot of different tools when I was a kid, but between figuring out passwords, putting my fingerprint onto my mom's phone, and spoofing mac addresses, I always found a way around the restrictions so I could stay up later.

But let's assume the majority of parents can actually do this. The problem with social media is not an individual one! We've fallen into a Nash Equilibrium, a game theory trap where we all defect and use our phones. If you don't have a phone or social media nowadays you will have much more trouble socializing than those who do, even though everyone would be better off if nobody used phones. As a teenager, you don't want to be the only one without a phone or social media. And so I truly do think the only solution is with higher level coordination.
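The Nash-equilibrium framing above can be made concrete with a toy 2x2 game. The payoff numbers are purely illustrative assumptions, not data; the point is only the structure, where "get a phone" is a dominant strategy even though universal phone use leaves everyone worse off than universal abstention:

```python
# Toy coordination game: each kid chooses "phone" or "no_phone".
# Payoffs (illustrative): being the only kid without a phone is
# socially costly, but universal phone use is worse for everyone
# than universal abstention.
payoff = {
    # (my_choice, others_choice): my payoff
    ("phone", "phone"): 1,
    ("phone", "no_phone"): 3,
    ("no_phone", "phone"): 0,
    ("no_phone", "no_phone"): 2,
}

def best_response(others: str) -> str:
    """My best choice given what the other kids do."""
    return max(["phone", "no_phone"], key=lambda me: payoff[(me, others)])

# "phone" is dominant: it's my best response regardless of the others...
assert best_response("phone") == "phone"
assert best_response("no_phone") == "phone"
# ...so everyone defects to (phone, phone) with payoff 1, even though
# (no_phone, no_phone) would give everyone payoff 2.
```

That gap between the equilibrium everyone lands in and the outcome everyone would prefer is exactly why individual parental choices struggle here and some form of coordination is needed.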

Now, it's possible that the government isn't the right organization to enforce this coordination. Unfortunately, we don't really have any other forms of community that work for this. People already get mad at HOAs for making them trim their lawns; imagine an HOA for blocking social media! I do think the idea of a community doing this would be great though, assuming (obviously) that it was easy to move into and out of, as well as local. This would also help adults!

So to be honest, I don't think parents have the individual power to fix this, even with their kids.


It's much easier to give individual users control over their own device than to give a centralized authority control over what happens on everyone's device over the Internet. Local user-controlled toggles are just easier to implement.

All parental moderation mechanisms can and should be implemented as opt-in on-device settings. What governments need to do is pressure companies to implement those on-device settings. And what we can do as open-source developers is beat them to the punch. Each parent will decide whether or not to use them. Some people will, some won't. It's not Bob's responsibility to parent Charlie's children. Bob and Charlie must parent their own children.

To the people arguing that parents are too dumb to control their children's tech usage because they themselves are tech-illiterate: millennia ago, we invented this new thing called fire. Most people were also "too dumb" to keep their children away from the shiny flames. People didn't know what it was or how dangerous it could be. So the tribe leader (who, by the way, gropes your children) proposed a solution: centralize control of all the fire. Only the tribe leader gets to use it to cook. Everyone else just needs to listen to him. Remember, it's all for you and your children's safety.


> we invented this new thing called fire [...] So the tribe leader (who, by the way, gropes your children) proposed a solution: centralize control of all the fire

Of all the things, a "save-the-children prolegomena to the Prometheus myth" certainly wasn't on my bingo card today. So thank you for that, but I'm not aware of any reports of fire-keeping in the way you've described. Societies and religions do have sacred traditions related to fire (like Zoroastrians) but that doesn't come with restrictions on practical use AFAIK.


I'll spell it out for you since you can't read between the lines. It's not actually about fire-keeping in tribes to protect children. It's about certain people (governments, corporations, organizations) wanting control over the Internet and everyone's digital communications. They don't want a free marketplace of ideas and uncensored channels of communication because their propaganda narratives would not survive.

The tribe leader refers to certain rich and powerful folks that have infiltrated governments and are running some of the largest businesses.

The fire refers to instant communication over the Internet. This relatively new technology has the potential to paralyze old power structures and reshape civilization. It's understandable why governments et al are panicking. They know their authority will wane under global free speech unless they do something.


Am I the only one that is repeatedly amused at how many smart people are just caving to making this about parents/children at all?

We've literally watched things unfold in real time, out in the open, over the last year; I don't know how much more obvious it could be that child-protections are the bad-faith excuse the powers that be are using here. Combined with their control of broadcasting/social media, it's the very thing they're pushing narratives in lockstep over. All this to effectively tie online identities to real people. Quick and easy digital profiles/analytics on anyone, full reads on chat history, assessments of ideologies/political affiliations/online activities at scale; that's all this ever was, and I _know_ hackernews is smart enough to see that writing on the wall. Ofc porn sites were targeted first with legislation like this; pornography has always been low-hanging fruit for running a smear campaign on political/ideological dissidents. It wasn't enough, though; they want all platform activity in the datasets.

I can't help but feel like the longer we debate the merits of good parenting, the faster we're just going to speedrun losing the plot entirely. It goes without saying that, no shit, good parenting should be at play, but this is hardly even about that, and I don't know why people give it the time of day. It's become reddit-caliber discussion; everyone's just chasing the high of talking about how _they_ would parent in any given scenario, and such discussion does literally nothing to assess/respond to the realities in front of us. In case I'm not being clear: talking about how correct parenting should be used in lieu of online verification laws is going to do literally nothing to stop this type of legislation from continually taking over. It's not like these discussions and ideas are going to get distilled into the dissent on the congressional floors that vote on these laws. It is in its own way a slice of culture war that has permeated into the nerd-sphere.


I make this argument to neutralize the "protect the children" excuse and also delegitimize the age verification "solution" by pointing out that on-device settings are more effective and easier to implement yet rarely discussed.

There are some parents genuinely concerned with parenting. We should give them the tools to do that, thereby removing them from the discourse; then we can focus on the bad-faith people who want more control. I think there are still enough well-meaning people in governments that if we popularize on-device settings, it will prevent age verification in at least a handful of countries, and that's good enough to keep the spark of the free Internet going until we figure out a more permanent solution.


> It's not like these discussions and ideas are going to get distilled into the dissent on the congressional floors that vote on these laws.

You think the idea of parents, not governments, being responsible for parenting doesn't translate well to voters? In the country founded on the idea of freedom from overreaching governance and personal responsibility?


that's not what i'm saying at all. i highlighted that that is quite literally the convenient narrative that's being used to get everyone squabbling amongst themselves. it is very clear that this is being used in bad-faith to get people to immediately side a certain way. yet here on hackernews we find dissenting viewpoints to that, rather than discussion about the entirety of it and what the real motives at play are. i am once again amused at the efficacy of the smokescreen here.

what i'm saying is that these discussions around parenting have had zero impact on preventing the passage/implementation of such legislation/policies to date, despite many smart people in here understanding what's actually at stake. and it's very likely that these parenting discussions will again have absolutely zero impact on preventing the continued implementation of id verification on platforms. these policies aren't being implemented because people have failed to fully thought-exercise good/bad parenting styles in the marketplace of ideas; they're becoming a reality because we aren't collectively raising awareness of the downstream ways this legislation will be harnessed for shitty outcomes. we aren't talking about it for what it is, but instead talking about it in the way they want us to talk about it. these parenting discussion points have been beaten to death and nothing new or novel is being shared, and rather than looking straight at the wolves right here in the room with us (data brokerage, who benefits from this type of data brokerage, and figuring out how to stop it), people just look at each other and get butthurt about ideological parenting differences. it's literally a slice of the now-ever-so-common 2d culture war we're all acutely aware exists, right here on hackernews, and we're all actively participating.


I guess I disagree that there is some shadowy alternative motivation for these laws. If the goal was to link everyone's ID with their account they would be requiring everyone to send in their ID instead of making age estimation the first option. I'm also a bit confused about the data brokerage part. What do you imagine the data brokers get out of this?

This only works if I ban my child from having any friends since they all have unlimited mobile access to the internet.

Could your child not just call or text their friends? Or is the real expectation to not have to intervene at all about their preferred platform?

Only if all the other kids are not on social media. When I was in school, birthday parties and such were organised on facebook. If you were not on facebook, you weren't invited.

If everyone was banned from facebook we would have organised them via text messages or email. That's the main point of social media age restrictions, individually banning kids is too punishing on those kids so parents and teachers don't try. Doing it across the whole population is much better.


I think the idea is for the child to see their friends in person... not call, text, or use the internet.

So even if their own child has no phone at all, they have access to the internet through other children's unlimited mobile access.


When I was growing up, we loved to lend the sheltered kids from the more conservative families media they weren’t supposed to have, like the Harry Potter books.

I'm saying they'll use their friend's devices.

Sorry, I know it's a hard line for parents to tread and it's really easy to criticize parenting decisions other people are making, but the "everyone else is doing it so I have to" always seems as lazy to me today, as it probably did to my parents when I said it to them as a teenager.

Is it more important to prevent your son from being weaponized and turned into a little ball of hate and anger, and your daughter from spending her teen years depressed and encouraged to develop eating disorders, or to make sure they can binge the same influencers as their "friends"?


We used to teach kids to be themselves and stand up for what they believe in and their own authenticity and uniqueness even in the face of bullying. That having less or other doesn’t mean your value is lesser or that you should be left out. Now we teach them… conform at all costs so you never have to risk being bullied or lonely?

The number of times I objected to my parents rules because my friends didn’t have those rules and the response was: “I’m not their parent.”

Is it more important to prevent your child from <...>, or to not be seen as an adversarial monster?

presumably being a parent is different from being your child's friend. There is overlap, but yes, sometimes being a good parent requires "laying down the law".

With that being said, i think explaining _in detail_ why you’re laying down certain rules can go a LONG way toward building some trust and productive dialogue with your child. Maybe you’ll find out they are more mature than you give them credit, can loosen up a bit. Or maybe a reasonable compromise can be found. Or maybe they’ll be bitter for a few months, but they’ll at least understand “why”.


Yes, if they do bad things like getting drunk, having sex, and doing drugs.

I would start with banning cellphones.


My greatest fear for my future young adult children is that they're on their cell phone all day and never have time to get in trouble with their friends, so there's that. Yes, Let's start with banning the cell phones.

this is the biggest problem. so many parents are head-in-the-sand when it comes to things that can damage a child's mind, like screen time, yet no matter how much you protect them, if it's not a shared effort it all goes out the window. the kid becomes incentivized to spend more time with friends just for the access, and can develop a sense that maybe mom and dad are just wrong, because why aren't so-and-so's parents so strict?

because their parents didn’t read the research or don’t care about the opportunity cost because it can’t be that big of a deal or it would not be allowed or legal right? at least not until their kid gets into a jam or shows behavioral issues, but even then they don’t evaluate, they often just fall prey to the next monthly subscription to cancel out the effects of the first: medication


Do you believe the research shows that screens in and of themselves are so powerfully damaging that being exposed for, what, a few hours a week at a friend’s house will cause them to require psychiatric medication?

So many questions. Are you campaigning against billboards in your city? Do you avoid taking your kids to any business that has digital signage? I assume you completely abstain from all types of movies and TV? What about radio or books?

What are you, personally, doing on HN?

Fascinating.


Individual Parents vs Meta Inc (1.66T mkt cap)

May the best legal person win!


It's more like age verification corporations, identity verification corporations, and the child "safety" organizations that were lobbying for Chat Control, versus individuals who want to protect their privacy.

Children are clever. I think the deeper issue is that very few parents care enough to actually articulate the danger to their kids.

As a kid, my dad sat me down and explained how porn could destroy my life. It's not hard to get people to act in their own self interest once they know what's destructive for them.

The problem is that most parents don't even understand just how damaging social media is.


> We'll try everything, it seems, other than holding parents accountable for what their children consume.

The mistake in this reasoning is assuming that they are actually interested in protecting the children.


This.

The world is becoming increasingly more uncertain geopolitically. We have incipient (and actual) wars coming, and near term potential for societal disruption from technological unemployment. Meanwhile social media has all but completely undermined broadcast media as a means of social control.

This isn't about protecting children. It's about preventing a repeat of the Arab Spring in western countries later this decade.

"Think of the children" is the oldest trick in the book, and should always be met with skepticism.


The Arab Spring was caused by a tripling of food prices; I somehow doubt something similar will happen in the west. As for the rest, ignoring the population's concerns (by suppressing social media) is the best way to cause political violence. So I see blocking the government's desire to shape political discourse as saving the politicians from themselves.

GP isn't interested in protecting children either. Punishing parents harder does nothing to improve the lives of children — in fact it makes them much worse, because now they are addicted to Facebook and their parents are in jail. It just makes certain people feel morally righteous that someone got punished.

"We'll try everything, it seems, other than holding parents accountable for what their children consume".

We'll try anything, it seems, other than holding internet companies accountable for the society-destroying shit they publish.

And it's not just children whose lives they are destroying.


If you are interested in learning about the other perspective, you can watch some parents’ congressional testimony here https://youtu.be/y8ddg4460xc?si=-yYduYDppF4TQWqD.

The character.ai one is gut wrenching.


> then give parents strong monitoring and restriction tools and empower them to protect their children

I think this is the right way to solve the problem.

For example, I think websites should have a header or something that indicates a recommended age level, and what kinds of more mature content and interactions it has, so that filters can use that without having to use heuristics.
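As a rough sketch of what a client-side filter could do with such a header: note that the header name and value format below are invented for illustration (the closest real prior art I know of is voluntary labels like the RTA meta tag), so nothing here reflects an actual standard.

```python
# Hypothetical filter logic for an invented "Content-Age-Rating"
# response header, where the value is the minimum recommended age.

def is_allowed(headers: dict, child_age: int) -> bool:
    """Allow a page unless it declares a minimum age above the child's.

    Unrated pages are allowed here; a stricter filter would block them.
    """
    raw = headers.get("Content-Age-Rating")
    if raw is None:
        return True  # no rating declared: permissive default
    try:
        minimum_age = int(raw)
    except ValueError:
        return False  # malformed rating: fail closed
    return child_age >= minimum_age
```

The interesting policy question is the default for unrated sites: a permissive default (as above) keeps the open web usable, while a strict default only works if labeling becomes near-universal.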


I agree with you that parents should be responsible, but your argument is clearly flawed.

> you can get in trouble if you recklessly leave around or provide alcohol/guns/cigarettes for a minor to start using

In the example here, there are 3 things where age verification is required AND parents have responsibility.

It’s not just one or the other.

The same responsibilities are not “thrown out”, they are never acknowledged in the first place.


As a parent, blocking websites is a joy; maybe the rule should be to give guardians more ability to control that. Trying to block some services is not trivial.

As a human, I'd love to see the rest of you fools quit that. If HN ever starts to algorithm me I'll be gone too.


And as a bonus you can block your boomer parents' access to cnn and msnbc (or whatever it's called now) and perhaps fox. It will make Thanksgiving a lot more pleasant for all.

PS Mom, I don't know why cnn doesn't work anymore. ;)


If they live under my roof, it's my rules. Turn about is fair play.

As for news, the art of discovering what in your subjective reality exists in the objective reality is something I don't expect we'll ever get gud at.


I'm not Americans but isn't Fox the worst one out of those?

It's very easy to lock up alcohol/cigarettes; a child should never have access. Internet usage is more like broadcast media; a child should have regular access.

The positives and negatives of Internet usage are more extreme than broadcast media but less than alcohol/guns. The majority of people lack the skills to properly censor Internet without hovering over the child's shoulder full-time as you would with a gun. Best you can do is keep their PC near you, but it's not enough.

We agree that a creepy surveillance nanny state is not the solution, but training parents to do the censorship seems unattainable. As we do for guns/alcohol/cigarettes, mass education about the dangers is a good baseline.

EDIT: And some might disagree about never having access to alcohol!


Devices such as phones come with an option when you first start the device asking simply whether it is for a child or an adult. Your router these days generally comes with a parental filter option on setup too. Heck, we have chatgpt that can guide a parent through setting up a system if they want something more custom.

If people want to push, they should just push to make these set up options more ubiquitous, obvious and standardized. And perhaps fund some advertising for these features.


Router parental filters are accountability sinks. They don't actually work, and they can't because we spent the last 20 years redesigning network protocols to prevent middle boxes from tampering with connections.

In what sense? DNS blockers generally work, do they not? Adguard also censors google search results.

I don't see why your kid should be browsing reddit.

I mean even only allow whitelisted sites. As I say this can be standardized further.

These measures I truly believe do not need to be 100% foolproof so long as the hurdle is high enough that children give up it's fine. And these measures could potentially notify a parent of a suspected breach or attempt to game it, without intruding too much into the child's privacy.
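The allowlist decision itself is simple; a sketch with made-up example domains follows. A real deployment would wire a check like this into the resolver (dnsmasq, Pi-hole, AdGuard Home, and similar tools all support allowlist-style rules).

```python
# Allowlist-only DNS filtering: resolve a name only if it is an
# approved domain or a subdomain of one. Domains are illustrative.

ALLOWED = {"wikipedia.org", "khanacademy.org", "school.example"}

def should_resolve(query: str) -> bool:
    """True if the queried name is on the allowlist or under an allowed domain."""
    name = query.rstrip(".").lower()
    return any(
        name == domain or name.endswith("." + domain)
        for domain in ALLOWED
    )
```

The `"." + domain` suffix check matters: without it, a lookalike such as `evilwikipedia.org` would slip through a naive `endswith` match.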


DNS blockers only work if the device/application is not adversarial or if you also have a smart enough firewall to block DoH, which is designed to blend in with web traffic. Once ECH is widespread, you'd likely need to MitM the device (so you need to install your CA, which is intentionally made very difficult and you might not even be able to do across all apps anymore on mobile devices? At least without enterprise MDM. And as was observed elsewhere[0], apps like spotify can contain a web browser), or perhaps use DNS requests as a trigger to briefly open a default deny outbound firewall.
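To make the cat-and-mouse concrete: a firewall can try to flag TLS connections that go straight to well-known public DoH resolvers, but the list below is a tiny illustrative sample, and an adversarial app can simply use a resolver the list doesn't know about.

```python
# Crude DoH-bypass heuristic: flag HTTPS connections made directly
# to a known public resolver IP. Maintained blocklists run to
# thousands of entries and go stale quickly; this sample is only
# for illustration.

KNOWN_DOH_RESOLVERS = {
    "1.1.1.1", "1.0.0.1",  # Cloudflare
    "8.8.8.8", "8.8.4.4",  # Google
    "9.9.9.9",             # Quad9
}

def looks_like_doh_bypass(dst_ip: str, dst_port: int) -> bool:
    """Flag a connection on port 443 whose destination is a known resolver."""
    return dst_port == 443 and dst_ip in KNOWN_DOH_RESOLVERS
```

This is exactly the kind of accountability-sink measure discussed above: it catches the default configurations and nothing else.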

Things have definitely been converging toward making it impossible for non-corporations to manage the devices they own, the network they run, etc.

[0] https://news.ycombinator.com/item?id=47128069


This is very interesting thanks.

I agree that ECH is perhaps a stumbling block although as you say MitM, this is indeed possible to pursue considering the whole set up child account on device thing going on with many of these devices.

On the rest of your points, fair enough, but again I ask: is it actually proportionate? Are we talking about children or black hats?


The black hats in this case are the software vendors. If your software prevents any ability to inspect any of its traffic (so you can't use external filters), and the OS doesn't offer ways to override/hook into that, and if the inbuilt parental controls are insufficient, you can't do much.

What are you going to do when every application (including web browsers) simply ignores and bypass your DNS filtering "for security" and every site is opaque (e.g. wikipedia looks just like pornhub to your router and every site is using one of a small number of major frontend proxies like cloudflare that's actively specifically working toward traffic opacity)? It happens that every major commercial non-server OS vendor (except Redhat?) is an ad company now, so they all have a reason to block your ability to filter traffic/restrict your configuration to only what they allow. And they're all working toward that.


Good point well made.

This is where Apple, Microsoft and Android need to step up. Indeed they already have in many ways with things being better than they used to be.

There needs to be a strict (as in MDM level) parental control system.

Furthermore there needs to be a "School Mode" which allows the devices to be used educationally but not as a distraction. This would work far better than a ban.


I dunno man. IMHO, kids should not have access to devices of any kind until the brain develops. I'm not sure what that number is, but let's say it's 15. At that point, we as parents need to be role models and let kids make mistakes. There is this whole idea that if you focus too much on security, you open the door for increased risk. I feel this applies to this situation[0].

When I was a kid, when I reached a certain age, 13 I think, there was nothing my parents could do to stop me from learning from my own mistakes. I think using blanket laws and tech to curb internet behavior is just going to backfire.

[0]: https://news.clemson.edu/the-safer-you-feel-the-less-safely-...


Microsoft has done a good job with Microsoft accounts and Microsoft Family Safety. It's about as user-friendly as you'll get outside of Apple, though the speed could be improved. And this only covers PCs; Android's system is less good.

Even with this, the problem requires more than pushing a button. Time, thought, and adjustment are needed. Like home maintenance, it's necessary, but not everyone can do it without help.

Getting AI assistance is good advice.


They could provide all the tools in the world. Unless there’s legislation change to what children are allowed to consume legally, everyone will largely ignore it.

Ironically, the government that is pushing this only set a drinking age just a couple of years ago (as in the last 10 years). In case you believed this was actually about kids.

The speech that worked (mostly) on the children in my life involved the concept of 'cannot unsee', which they seemed to understand. There are some parallels to gun safety here because there are things that even the adults in your life try not to do and it seems perfectly reasonable to expect the same from children.

In fact being held to a standard that adults hold themselves to is frequently seen as a rite of passage. I'm a big girl now and I put on my big girl pants to prove it.


parents can be held liable for buying their kids cigarettes but, similarly, tobacco companies are (at least nominally) not supposed to target children in their advertising campaigns and in the design of their products.

It's obviously not a 1/1 comparison here, because providing ID to access the internet is not analogous to providing ID to purchase a pack of Cowboy Killers but we can extrapolate to a certain extent.

(inb4 DAE REGULATING FOR-PROFIT CORPORATIONS == NANNY STATE?!?!?!?!?)


You can expect the individual to compensate for a poorly structured society all you want.

> you can get in trouble if you recklessly leave around or provide alcohol/guns/cigarettes for a minor to start using

You can only expect so much from individual responsibility. At some point you need to structure society to compensate for the inevitable failures that occur.

> They are in a much better and informed position to do so than a creepy surveillance nanny state.

I'd rather live in a nanny state than ever trust american parenting. We've demonstrated a million times over that that doesn't work and produces even more fucked up people and abused children.


The richest brightest minds of our generation all being motivated towards one addictive goal, and we'll just put the responsibility on the parents...I think society can collectively do better.

If we expect Parents to treat Social Media like other unhealthy, dangerous, and highly addictive products, then that can never start with "just expect ignorant parents to all magically start doing something difficult, for no real reason".

It starts by banning kids from the internet, entirely. It starts with putting age restrictions on who can buy internet connected devices. It starts by arresting parents and teachers who hand pre-literacy kids an always-online iPad. It starts with an overwhelming propaganda campaign: Posters, Commercials, After-School Specials, D.A.R.E. officers, red ribbon week.

Then, ultimately, it still finishes with an age-gated internet where every adult is required to upload their extremely valuable personal information to for-profit companies, for free (With the added weight of being forced to agree to extreme ToS, like arbitration agreements).

So what do we do? I agree that the age of entry to the internet should match other vices (currently 21+ in the US, although really that should probably be 18+)...

It will never be acceptable for a single country's police state to extend across international borders, so... we just ban all of the UK and Australia from every web service until they get withdrawals and promise to stay nice? That could be a start.

But this whole situation is like 'freedom of speech': once you start picking and choosing what counts as "acceptable" speech, suddenly you lose everything. You literally can't make everyone happy, because everything subjective is open to contradiction, and because there are freaks in the world who will never be satisfied by anything less than a complete global ban of everything.

Who gets a say? Do the Amish get to tell us what we are allowed to do? Where do you draw the line? You can't. Completely open is the only acceptable choice. But I still vote we start publicly mocking the parents who give their kids an ipad, and treat them like they just gave that kid a cigarette. Because seriously, they're ruining that kid.


I've already thought about it from the US's perspective and here's my path forward.

If government does not want kids to have access to the naughty bits of the internet, but thinks there's something worth sharing with children, then the government should provide a public internet for kids, and THAT'S the site that will ask for a login known to belong to a kid. We already do public schooling with public funding; we do not let rando adults sit in classrooms with kids, and the kids get a school ID. Boot under-18s off the public internet AT THE SOURCE, when internet connectivity is PURCHASED / CONTRACTED FOR with a valid adult ID / proof of age, but allow them VPN access into whatever the government thinks the child should have access to, like the school's page. I would say online encyclopedias or wikipedia-type things, but I'm not sure the government wants children to read about the variety of so many different things on this planet we're sorta trapped on. And let's face it, restricting the kids' communications to points outside the control of parents is exactly what the government is complaining about; the government does not want kids to have free access to information.

Think of a phone or tablet that can only access the network through either a proxy or vpn but otherwise locked down. It certainly seems like it doesn't require much programming, heck have trump vibe code it for all I care.

I mean yeah, parents could just teach their kids the tough stuff, because that's how it used to work anyways. Well, that and the libraries and schools, but those can be pruned of bad books and bad teachers at the request of government anyways, right? The kids could also be interviewed periodically by government to inventory what topics they have discussed with their parents, to weed out the 't' or 'g' words.

I mean yeah, I don't see a place for facebook in that intranet, but isn't that sort of the point? We all know big social media will be incentivised to promote engagement with less regard for safety, so why do kids need facebook anyways? The instagrams and ticktalks are worse, although maybe government should make a child-friendly ticktalk-type school social network. Call it trump's school for kids for all I care; folks in power right now, and a significant part of the US, believe that trump knows what's good for kids, right?

I mean obviously the libraries have to be REALLY REALLY cleaned up, but that's just a detail. But why are parents forcing internet weirdos onto their kids with these smartphones / porno studios in their pockets? What do they think, chester the molester on ticktalk is gonna have the kid upload their ID? Even if he does, do we really want that? c'mon man


The difference with guns, tobacco and alcohol is huge: all negatives aside, giving kids what they want makes the life of a parent so much easier. Take it away and many parents will fight. Sugar is in the same game.

I'm 40. Do I need to get my parents to vouch for me? Who vouches for them?

> then give parents strong monitoring and restriction tools

As written, this sounds very glib. I cannot take this comment seriously without a game theory scenario with multiple actors.


Should we allow kids to get cigarettes? Cocaine? Should all parents "just" be better parents and problem solved?

What is your proposed recourse for me as a parent if your kid shows my kid gore videos?

What is your recourse if his kid gives yours a bottle of whisky?

you can’t blame it on parents alone, but the odds are stacked against children and their parents, there are very smart people whose income depends on making sure you never leave your black mirror

the surveillance state is possible, achievable, and a few coordination games away from deployment with backing from a majority who should know better

inertia kills, I dunno


The greatest of uphill battles in today's current climate is trying to push anything in the realm of personal responsibility.

Politicians' whole basis for nearly every campaign is "you're helpless, let us fix it for you."

For the vast majority of problems plaguing society, the answer isn't government, it's for people changing their behavior. Same goes for parenting.

But unfortunately, "you're an adult, figure it out" isn't the greatest campaign slogan (if you want to win).


It wasn’t always this way: “Ask not what your country can do for you — ask what you can do for your country”

What? Are there billion dollar companies with huge staffs who are constantly trying to figure out how to sneak my child a gun all the time, at school, wherever they go?

I'd say this comparison is good -- we as a society have decided that people who provide alcohol, guns, and cigarettes are responsible if children are provided them. You don't get to say 'hey, you didn't watch your child, they wandered into my shop, I sold to them 2 liters of vodka and a shotgun'.


Yes, children are clever - I was one once.

A counterargument to your point that children are clever - I was also one once.


The vast majority of parents aren't tech-savvy enough to be able to operate IT parental controls.

Except companies provide wholly inadequate safeguards and tools. They are buggy, inconsistent, easily circumvented, and even at time malicious. Consumers should be better able to hold providers accountable, before we start going after parents.

The only real solution is to keep children off of the internet and any internet connected device until they are older. The problem there is that everything is done on-line now and it is practically impossible to avoid it without penalizing your child.

If social media and its astroturfers want to avoid outright age bans, they need to stop actively exploiting children and accept other forms of regulation, and it needs to come with teeth.


How easy is it for kids to bypass Parental Controls on iOS devices?

Social engineering is the most effective strategy, because iOS screen time controls are so buggy that eventually parents throw up their hands in exasperation and enable broader access than they would otherwise choose.

It’s one setting to only allow a whitelist and not allow apps to be downloaded. Yes parents might actually need to learn technology

I use it, I am quite familiar with the bugs. The app controls randomly duplicate themselves and change in scope. It would almost be comical if it had not been happening for so many years to so many people. Apple knows, does not care.

When everything is turned off by default, iOS Screen Time is very effective. It also has effective tools to grant certain exceptions, facilitated by Messages. It also distinguishes between "daytime" and "downtime" for the purposes of certain apps and app attributes, like the contact list. For example, we have ourselves, grandparents, and the neighbors as "all the time" contacts, but their friends as daytime only. They don't retain their devices at night, but it is possible for them to pull them from the charging cabinet.

> Except companies provide wholly inadequate safeguards and tools. They are buggy, inconsistent, easily circumvented, and even at time malicious. Consumers should be better able to hold providers accountable, before we start going after parents.

We could mandate that companies that market the products actually have to deliver effective solutions.


Cue blog posts about section 230 and how it’s impossible to do hard things and parents should be held accountable not companies, meager fines, captured bureaucrats, libertarians, and on and on…

Yes, but how on earth is their malicious compliance at providing parental controls a good reason to go for the surveillance state that hurts absolutely everyone?

Social media operators love the surveillance state idea. That's why they aren't pushing against this.

I even cancelled YT Premium because their "made for kids" system interfered with being able to use my paid adult account. I urge other people to do the same when the solutions offered are insufficient.


Step 0 is physical device access. Kids shouldn't have tablets or smartphones or personal laptops before age 16.

16 is a bit steep, but I do generally agree with your sentiment. I wish there were more educational home computers like there were back in the day, like the BBC Micro. I have a startup idea to make something like that (mostly as a dumping ground for my plethora of OS-software and computer education ideas) but don't currently have the resources, and have doubts about how successful something like that would even be in this day and age. I'm only 18ish (not giving my actual age for privacy reasons, but it's within a 5-year margin) and feel like my peers would rather be locked to platforms and consume than learn to create and actually use computers, despite there being a very obvious need (I once had a 20-year-old look at me like I had 2 heads for asking them to move something into a folder).

> Kids shouldn't have tablets or smartphones or personal laptops before age 16.

If you make such a restriction, they'll secretly buy some cheap "unrestricted" device like some Raspberry Pi (just like earlier generations bought their secret "boob magazines").


Parents should have an allowlist of devices that are able to join their network. And then they can require root certs or something for access outside of a narrow allowlist. There's a host of ways to solve both problems. Just remember to check for hardware keyloggers on your (the parents') devices, as kids could use them or try evil-maid attacks, etc. if they feel totally encaged.
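For what it's worth, the allowlist part is doable on a home router running dnsmasq. A rough, hypothetical sketch (the MAC addresses and device names are made up, and this only deters casual circumvention, since MACs can be spoofed and cellular data bypasses the home network entirely):

```
# /etc/dnsmasq.conf fragment (hypothetical example)
# Tag each approved device as "known" and pin its address
dhcp-host=aa:bb:cc:dd:ee:01,set:known,kid-laptop,192.168.1.20
dhcp-host=aa:bb:cc:dd:ee:02,set:known,parent-phone,192.168.1.21

# Refuse DHCP leases to any device not tagged "known"
dhcp-ignore=tag:!known
```

The root-cert idea (a TLS-intercepting filter for everything else) is a much bigger lift, which says something about how much effort this asks of ordinary parents.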

This will only work in practice if one of the parents is a network technician. :-)

I've said it before but prohibition works, if the goal is to reduce usage. I don't see this as a realistic problem.

This is the craziest thing I’ve heard in a while. They shouldn’t have connected game systems either?

No, because those devices have little or no controls and those controls are easily bypassed and/or not honored by the platform.

I think they should. There's a fine line between beneficial and detrimental. I had a 3DS growing up and could browse the web with its very gimped browser, and I think something like that is actually very good for a child (able to access the internet and view simple and informative sites while being too limited to access social media and the like).

The problem is unrestricted access to mobile devices. A game console or desktop PC isn't as big of a deal.

What’s the difference? They all reach the same internet

Have you ever visited any game store and turned off nsfw protection?

I love gaming, but I hate all the smut games. It discredits the medium, essentially what has also happened to anime.


I'm kinda baffled about the Switch store's quantity of dating/whatever adult-ish games.

I don't really want to turn on age-based filters (to the point that I've never investigated if they even exist) but at this rate, there's hardly anything worth looking at in the recent feed.


The target demographics for Nintendo products have shifted from kids to.. kidults? Most kids nowadays play on phones or in rarer cases PC/Xbox. Nintendo's lost much of their cachet (in my visible experience) save for children parented by the "mindful millennial" types.

Makes sense but there's just... so much of it. That and all the shovelware.

It's just hard to imagine that's anything close to what Nintendo wants users to experience, but I guess they need the money.


They really could find a niche in making phones for kids that have walled-garden internet access; they were so good at doing so with the DS, but alas..

I bet not many of us would be here now if we hadn't had our own computers before age 16.

Today's young people are already technologically retarded (in the literal sense) and barely know how to use Microsoft Word or navigate with a file explorer; this would make the problem significantly worse.

I hope they do pass a law like that, because it'd give my kids a gigantic advantage over the kids who had no access to modern technology and the free flow of information until the age of 16. If you want to leave your kids completely unable to find any kind of gainful employment in the AI era, be my guest.

> If you want to leave your kids completely unable to find any kind of gainful employment in the AI era, be my guest.

Your kid is screwed either way. Unless he moves to India.


The parents themselves weren't raised with the digital literacy required.

This doesn't let the parents off the hook. If you or anyone can share resources that can reach parents and are as easily consumable, viral, and applicable as the content that is the issue, I would be happy to help spread them.

The reality is kids today are facing the most advanced algorithms and even the most competent parents have a high bar to reach.

The solution is simple.

I want to be able to permit whatever pixels appear on a child's screen. Full stop. That hasn't been solved for a reason: developing such a gate would work, and it would not allow algorithms to reach kids directly or indirectly.

The alternative is not ideal, but until there's something better, it's what it will be, and it's well proven on the mental-health side of raising resilient kids who don't become troubled young adults: no social media or touch screens until 10-13.

There are lots of ways to create with technology, and learning to use words (llms) and keyboards seems to increasingly have merit.


> "The parents themselves weren't raised with the digital literacy required."

At this point, that isn't true anymore. There was social media when the parents were school aged. The world didn't start when you were 10 and the Internet is a half century old.


I thought the same thing until someone asked me how many of them have been able to overcome digital addiction and set a path ahead that's healthy.

Being literate in something isn't just knowing how to use it, but how to manage its use for oneself and for others.

Once you see the importance of it, knowing where and how to start managing what kids are exposed to in an age- or developmentally-appropriate way is an entirely different skill: meeting and managing the digital literacy of another human, especially a child.


> then give parents strong monitoring and restriction tools and empower them to protect their children.

Because parents don’t abuse massive surveillance tools.

Given that most abuse happens in the family and by parents maybe it’s a bad idea to give them so much power


So we should trust the governments of the world? The same governments that don't seem to be doing anything about a large group of people that visited a specific island to abuse minors?

Where did I say that?

Exactly, nowhere.

If I'm contra B, it doesn't mean I'm pro A.


What other realistic option is there if parents aren't going to be empowered to raise their own children?

Don't you have to age-verify to get alcohol? We don't leave that up to the parents. Feels like you defeated your own argument with your examples.

The internet is not an object, it's a communication medium. That is an apples-to-oranges argument; it doesn't wash.

None of this push has anything to do with protecting children. Never has, never will. Stop helping them push the narrative, it's making the problem WAY worse.

ITT are a lot of tech workers who made their money as cogs in the system poisoning the internet that future generations will have to swim in. I wonder if toxic-waste companies also tell parents it's strictly on them to keep their kids out of the lakes that are poisoned, but once flowed cleanly?

We live in a shared world with shared responsibilities. If you are working on a product, or ever did work on a product, that made the internet worse rather than better, you have a shared responsibility to right that wrong. And parents do have to protect their kids, but they can't do it alone with how systematically children are targeted today by predatory tech companies.


Age verification tech companies are lobbying heavily for governments to legally require their services. The proposed "solutions" are about funneling money into the hands of other tech companies and shady groups, while violating user privacy.

If anything, we should be banning the collection of any age-related information as a condition of accessing social media and more mature content. We need companies to respect privacy, rather than legislating even more privacy violations.


If anything, we should be preventing young people from being exposed to the version of the internet that currently exists until the tech companies that made it this way offer a solution. I'm all ears if you have an alternative that big tech could implement while they are tasked with cleaning up the mess they've made.

Bro, the internet was made by everyday people. Corporations just imposed their shit on top of it. I'm all for the corporate part going away, but I think it's better if we make social media corporations transparent so we can target how they operate those services. Age-gating users is not the answer.

I've been on the internet for more than 20 years. It got a lot worse in the last 10. Individuals maybe shaped it in the early days, but the disastrous mess we have today is from the monetization and ensuing garbage that was pushed onto the world by some very profitable tech companies.

Undo the damage or otherwise come up with a way to shield kids from it. I won't let my own kids anywhere near the open web the way it is today. It's poison for young minds and needs to be fixed or gated off. Like alcohol at this point.


It comes from a combination of things that always existed getting online and the monetization of the attention economy. Influence operations (both corporate and governmental) are the source of most of the problems. Bots, influencers pushing propaganda, etc. I suspect you are actually a bot but others might read this so...

The biggest changes to the Internet over the last few years are usually in the political spaces. There are a few other things, but mostly it's political. Those other things always existed, but now they are online. This isn't the fault of the communications medium; it's the ills of society leaking into online spaces. If we banned those things online, you as a parent would still have to worry about them happening IRL. It's better to talk to your kids about these dangers honestly, and it always has been. It's always been easier to just prevent your children from being exposed to those dangers, but that usually backfires later on. Banning unpopular political discourse has never been the answer to these issues. But in this case, banning discourse is the goal and children are just the excuse. As proof of this, the same government pushing this only instituted a real drinking age in the last 10 years, in a country known for making liquor.


> I suspect you are actually a bot but others might read this so...

I'm floored lol. What gives you this impression?

The worst part of this inflammatory nonsense is that, sadly, I'm probably the only person that will read your full comment. And I fundamentally disagree with your thesis of attributing this to "politics". Social media and its effects were poisonous long before "politics" were so prominent. You could see it even during early Obama times. The simple infinite scroll and forcing individuals to so regularly compare themselves to each other was already awful long before "politics".


The last line in my statement answers your question. If you leave it up to the government to try and regulate a medium, you are asking for trouble. It's like telling a news source what it can and can't release, news-wise, because a portion of the population (kids) is harmed by the information.

I understand where you are coming from but age gating is not the answer for a communication medium.


I'm asking you for the alternative. Every day this continues, it is literally ruining lives before they start. Like lead in water, time is of the essence. So what is the alternative to fix it?

Support candidates that will put anti-trust first and end Citizens United. Both of these issues make holding companies accountable for the harm being caused by the lack of transparency impossible. The problem is corruption and corporations not being accountable to the people. When those problems get resolved, these issues will become less problematic.

I love the vision. It might take a decade or more to play out. We don't want a lost generation in the interim, so what do we do ASAP to get it under control?

We don't. The problems are created by corporate greed; they are only solved by dealing with that. Making the internet less free, as I said, isn't the answer, and there is no way to fix this in the short term without making the corruption worse.

Respectfully, I disagree and find your proposed solution, akin to "keep letting young people have their lives ruined", unsatisfactory. Which is probably why we're in this mess.

It's been my experience that our lives, whether they are ruined or not, are up to us. Making it someone else's responsibility only prolongs the problem. You either see that, or you don't. It's pretty clear at this point that corruption and greed are the problem, and the fact that we can't see our way forward to being responsible adults is the part that is going to cause humanity's downfall. When everything comes crashing down, the people who will be left are the people who are taking responsibility for the problem and not making it someone else's.

> It's been my experience that our lives, whether they are ruined or not, are up to us.

This maybe applies to adults. It does not to children that cannot yet fend for themselves. You are basically throwing them to the wolves. This can be your choice, but it won't be mine.


We are not throwing them to the wolves. Our complacency created this problem and now kids are being affected by our actions and inactions. IMHO the best we can do as parents is try to protect them in a world gone mad. Appealing to governments and corporations that created this problem in the first place (with our acquiescence) is going to make the problem worse because we have evil people behind the scenes using this information against our wishes.

>If you are working on a product, or ever did work on a product, that made the internet worse rather than better, you have a shared responsibility to right that wrong.

This is how the "predatory debt" involved has built up, and grown exponentially until now, and the only thing Facebook considers as a solution would be to pay it down using other peoples' resources instead of their own.

No one else has matching leverage and the dollar figure would be many billions if not a full trillion or more, which is about what it's worth, and who else could afford that except Facebook?

So it has to come from the collective subtraction of everyone's complete privacy. Just to amount to something comparable.

Add that up and it shows you how valuable privacy really is and what it's worth in dollar figures.

Yes, do the math: privacy is worth more than Facebook no matter what; it always was and always will be.

You can't have both, so big tech should jettison Meta. Who else could afford it?

A more non-existential solution would be for Meta to fully fund a completely anonymous internet to replace the one that they soiled from the beginning, and let them keep the (anti-)social-media exclusive network separate.


I'm with you.

This is what I thought when Facebook first came out;

It was going to be like MySpace where most people were expected to remain anonymous like the internet had always been, and only those who actually wanted to be identifiable could reveal as much information as they personally wanted to.

But no, Facebook wanted everybody's personally identifiable information as table stakes, not only those who really wanted to promote themselves or gain personal recognition.

There was no other way to sign up.

I thought people would be too smart for that. But Facebook was "free" to use, and it learned a lot from its first major game scourge, Zynga.

Naturally I've been waiting for it to stand the test of time, and it does look like it has been a complete failure when it comes to being worthwhile.

Facebook started out with enshittification as a business model but the next major escalation came when people had to have an "account" before they could even browse the site any more.

People who had actually enjoyed it were somewhat pressured to join just so they could continue following those who were promotional. Linkedin did this too and made it no longer worth visiting either. So much for supporting the members who were intended to be promoted.

You can only imagine my shock years ago when I found out Facebook was a billion-dollar company.

Things like this were never even supposed to be worth money.


If your goal is to "save the children", then sure, we can discuss this... if your goal, as a government, is to have everyone get some digital ID and tie their online identities to their real names, then you do just that.

We'll do everything, it seems, other than holding billionaires accountable for what their businesses consume.

We should stop pretending these age-verification rollouts are about protecting children, because they aren't and never have been.

Even if the world was full of responsible parents, there are still people and groups that want to establish a surveillance state. These systems are focused on monitoring and tracking online activity / limiting access to those who are willing to sacrifice their own personal sovereignty for access to services.

There is most definitely a cult that is obsessed with the book of revelation and seeing Biblical prophecy fulfilled, and if that isn't readily obvious to folks at this juncture in time, I'm not sure what it will take. I guess they'll have to roll out the mark of the beast before people will be willing to admit it.


It's funny, all the bible wankers screamed about "the mark of the beast" over things like RealID. Now we have fascists setting up surveillance and censorship tools to tie speech and movement to centralized ID...and they're lining up to lick boots.

You should need to show ID and prove you're over 18 to enter a church. At least we know they're actually harmful to children.


“One day I’ll own that boot…”

The people pushing this are the same ones who are always screaming about "fascists". Also, your ideas in your post are anti-liberal and anti-constitutional (in the US).

In the context of government-mandated identity checks for speech, either both are unconstitutional or they're not, in the latter case it's time to start cracking down on the dangers of religion.

I hope society comes to the former conclusion and the egregious attack on freedom of speech on the internet is discontinued.


A strict reading of the constitution would also imply that limiting gun ownership to those who show ID and can prove they are 18 is unconstitutional. "Anti-liberal" and "anti-constitutional" are in the eye of the beholder.

> We'll try everything, it seems, other than holding parents accountable for what their children consume.

It’s not a fair fight. These are multi-billion dollar companies with international reach and decades of investment and research weaponized against us to make us all little addicts.

Additionally, it’s not fair or reasonable to ask parents to screen literally everything their kids do with a screen at all times any more than it was reasonable for your parents to always know what you were watching on TV at all times.

This is bootstraps/caveat emptor by a different name. It's not "I want someone else to raise my kids." It's "the current state of affairs shouldn't be so hostile that I have to maintain constant digital vigilance over my children." Hell, if you do, people then lecture you about how "back in their day they played in the street and into the night" and call you a helicopter parent.


You are screwed, but not for the reason you claim. It's because you don't take any accountability for yourself. There is and was no hope for someone who does that at any point in human history. Is it fair? Nope... but it also doesn't mean you have zero autonomy.

So when people try to take accountability in a democratic government, by changing the law to what they want through democratic means that suddenly is having no accountability for one's self?

Good lord, Silicon Valley must have lead pipes.


> You are screwed, but not for the reason you claim. It's because you don't take any accountability for yourself.

That was an incredibly rude personal attack and completely unwarranted. You cannot talk to people like that here.

I won’t be discussing this with you further. Have a good rest of your week.


Excellent example of low effort cookie cutter empty rhetoric that would fit perfectly in reddit.

Do you have kids?

>> In the United States, you can get in trouble if you recklessly leave around or provide alcohol/guns/cigarettes for a minor to start using, yet somehow, the same social responsibility seems thrown out the window for parents and the web.

So anyone can walk into a shop and purchase these things unrestricted? It's not the responsibility of the seller too?


Tobacco: yes, you can order pipe tobacco and cigars online, sent to your house without ID.

Guns: yes, you can buy a Schmidt-Rubin cartridge rifle or a black-powder revolver sent straight to your home from an online (even interstate) vendor, no ID or background check, perfectly legal.

Alcohol: yes, you can order wine straight to your house without ID.

These are all somewhat lesser-known "loopholes," but they haven't really turned out to be a problem despite no meaningful controls on the seller. You probably didn't even know about these loopholes, actually -- that's how little of a problem it's been.


>We'll try everything, it seems, other than holding parents accountable

The government took over most parenting functions, one at a time, until the actual parent does or is capable of doing very little parenting at all. If the government doesn't like the fact that it has become the parent of these children, perhaps it shouldn't have undermined the actual parents these last 80 years. At the very least, it should refrain from usurping ever more of the parental role (not that there is much left to take).

You yourself seem to be insulated from this phenomenon; maybe you're unaware that it is occurring. Maybe it wouldn't change your opinions even if you were aware.

>If you want to actually protect children

What if I don't want to protect children (other than my own) at all? Why would you want to be these children's parent (you suggest you, or at least others, want to "protect" them), which strongly implies that you will act in your capacity as government, but then get all grumpy that other people want to protect children by acting in their capacity as government?


The expectations on parents in the USA are at their historical high. What are you on about here? The expectation that parents will perfectly supervise their children at every moment of their lives until adulthood is (a) new and (b) at its historical max.

> holding parents accountable for what their children consume

There is a local dive bar down the street. I haven't expressly told my kids that entering and ordering an alcoholic drink is forbidden. In fact, that place has a hamburger stand out front on weekends and I wouldn't discourage my kids from trying it out if they were out exploring. I still expect that the bartender would check their ID before pulling a pint for them.

It takes a village to raise a child. There are no panopticons for sale in the next aisle over from the car seats. We are doing our best with very limited tooling, from the client to across the network (of which the tremendously incompetent schools make a mockery with an endless parade of new services and cross-dependencies). It will take a whole-of-society effort to lower the risks.


Also, there's a huge argument to be made that surveilling your kids is really, really bad for their development.

Yes, my spouse and I were very conscious of this. My kids are now at an age where some of the just-in-case tracking chafes, and they ditch trackers and turn off location on their watches. It's a normal renegotiation that occurs as they pass through various maturity thresholds. The older of them has thus far rejected phones and watches and uses Omarchy on an old Thinkpad.

That same argument doesn't hold water on the internet. It's a communication medium, a flow of information. You don't enter or leave physical spaces; the information flows to you wherever you are. Trying to apply the same kinds of laws to the internet is a recipe for disaster because you are affecting everyone at the same time.

Yes, AFAIK authentication is performed by applications at L7 and as such flows via Internet protocols like anything else.

All kinds of laws are applied to services provided via the Internet. For example, once upon a time people said collecting sales tax was an insurmountable problem and a disaster for ecommerce. Time passes and, what do you know, people figured out ways to comply with the laws.


Your example focused on time and place because taxes are applied at a transactional level, between the person purchasing goods online and receiving those goods physically.

Age-gating is not the same thing; there is no transfer of goods. It's someone's arbitrary idea of what should and shouldn't be allowed on the internet. And it's pretty clear at this point that it's about control over information. There are plenty of articles on the subject if you care to look.


Taxes also apply to services and information, not just goods. I just checked some invoices to double-check my recollection.

You have made a claim that age gating some online services is an "arbitrary idea." I don't see how that is different from taxes at all. Taxes are likewise an "arbitrary idea." Taxes are likewise a societal control measure.

There is no need for articles to explain a very straightforward truth. If you are unable to make the case for something, claiming unspecified writings elsewhere doesn't get you any further.


We live in a technofeudalist society now, we're all at the whims of the tech corps

Age verification doesn't work in favor of a tech corp like Facebook, as they will see some users leave: some because they don't have the required age, and some because they don't want to do the verification.

> We'll try everything, it seems, other than holding parents accountable for what their children consume.

The way to keep kids from eating (yummy) lead-based paint chips was not holding parents accountable to what their kids ate, but banning lead-based paint.


It’s weird that you blame the victim.

The real question is why do we leave it to parents or intrusive surveillance instead of holding companies accountable?


This tired argument again. It doesn't work. It's like keeping your kid from buying alcohol when all their friends are allowed to buy it. The whole age demographic has to be locked out of the ecosystem.

Well, yes. If your friends can all go 'round to David's house, where David's parents hand each child a case of beer and send them on their way, any attempt by the other parents to prohibit underage drinking is going to be ineffective. But most parents don't do that. (I've actually never heard of it.) So social solutions involving parent consensus clearly do work here.

"But it's behavioural!" I hear you cry. "What's stopping children from going out, buying a cheap unlocked smartphone / visiting their public library / hacking the parental control system, and going on the internet anyway?" And that's an excellent objection! But, what's stopping children from playing in traffic?


Yeah, but it's illegal for parents to give the other kids beer, with serious criminal repercussions. That's why most people make sure it doesn't happen, not just some social sense of responsibility. You would need something similar for smartphones/social media.

    That’s why most people make sure it doesn’t happen
Were you not invited to parties in high school? My experience growing up (and my experience being a neighbor to people with teenage children even now) says otherwise.

> Were you not invited to parties in high school?

Did you forget what web site you're on?


Every high school and college freshman party I've been to involved some serious planning to find alcohol. It was always hit or miss, and not easy.

The US generally has strict anti-alcohol laws, with exceptions for legally-recognised familial relationships (e.g. children, spouses). The UK doesn't: its laws are restricted to "the relevant premises" (https://www.legislation.gov.uk/ukpga/2003/17/part/7/crosshea...) and "in public" (https://www.gov.uk/alcohol-young-people-law – can't find the actual law right now); but still, the behaviour I described does not occur in the UK often enough for me to have heard of it. I have, however, heard about similar behaviour from the US, where "we all go out late at night and become alcoholics" seems to be a culturally-acceptable form of teenage rebellion.

People, for the most part, have no respect for the law. They usually haven't even read the law. They have respect for what they consider appropriate or inappropriate behaviour. (Knowingly breaking the law is, in most instances, considered an inappropriate behaviour – except copyright law, which people only care about if there are immediately-visible enforcement mechanisms. Basically everyone is fine with copying things from Google Images into their PowerPoint presentations… but I digress.) Most people would object to murder, even if the law didn't forbid it. This distinction is important.

Is there a law that says "children must not play in traffic"? Probably! Haven't the foggiest idea which it would be, though. That law (if it exists) is not why children don't play in traffic. The law against giving alcohol to children (if it exists) is not why we don't give alcohol to children. We can establish similar social norms for deliberately-addictive, deceptive, dangerous computer systems, such as modern corporate social media.


We can establish social norms, but companies have a tendency to ignore those norms if it makes them money and it isn't illegal (maybe not all or even most companies, but if it's profitable, some company will do it and expand into that niche). So it makes sense to make it illegal for those companies to provide services to children, and then establish a social norm that parents won't create an account for their children/bypass the checks that companies need to do. Just like with alcohol: it is illegal for stores to sell it to minors, and they must check ID; we don't just let them shrug and say a 14 year old looked 21, and at least in the US, that would be a criminal offense. It's then socially unacceptable (and maybe also illegal) for a parent to buy a ton of alcohol so their kid can host a rager for all of their friends.

Drawing out the alcohol analogy further, you can actually buy alcohol on Amazon, subject to an ID check. I'm not sure why no one bats an eye at this, but somehow e.g. porn or other adult-only services are different.

It's long been an established, reasonable stance that it is both the parent's responsibility and decision to allow or deny certain things, and it's also illegal for businesses to completely undermine the parent's ability to act as that gatekeeper for their kids.


> So it makes sense to make it illegal for those companies to provide services to children

I'm in favour of this, so long as the restriction is narrow. Children shouldn't be on Facebook, but they should be able to participate in the RuneScape forums under a pseudonym, or contribute to Wikipedia (provided they understand the "no, nothing can be deleted ever" nature of the edit history).

However, most of the things we'd want to prohibit for children, aren't actually good for anyone. It would be much easier, in one sense, to blanket-ban the bad guys: no new accounts may be created on services like Facebook or Discord, unless they change their ways.


Just because you haven't heard of it doesn't mean it isn't common. Parents take different approaches. Some friends' parents preferred we did it in their house, where they could maintain some level of safety, rather than have us drinking recklessly in a field. Others thought providing some beers was better than us buying the cheapest vodka available. And I'm sure other parents wouldn't have liked these approaches if they knew about them.

I'm familiar with the "semi-supervised drinking inside" approach. "Provide beer so they don't drink cheap vodka" isn't an approach I'd heard of; it's close enough to my Poe's-law straw position to weaken my argument.

The thing is, what are the parents to do beyond restricting things? You find out some creep has been talking to Junior; do you talk to your local police department, state agency, or to the feds?

We've never properly acted upon reports of predators grooming children by investigating them, charging them, holding trials, and handing down sentences on any sort of large scale. There's a patchwork of LEOs that have to handle things and they have to do it right. Once the packets are sent over state lines, we have to involve the feds, and that's another layer.

Previously, I would have said it's up to platforms like Discord to organize internal resources to make sure that the proper authorities received reports, because it felt like there were instances of people being reported and nothing happening on the platform's side. Now, given recent developments, I'm not sure we can count upon authorities to actually do the job.


Back in the day you would beat up that person.

> The thing is, what are the parents to do beyond restricting things?

Well, I can't speak for parents (as in all parents). I can, however, tell you what we did.

When two of my kids were young we gave them iPods. The idea was to load a few fun educational applications (I had written and published around 10 at the time). Very soon they asked for Clash of Clans to play for a couple of hours on Saturdays. We said that was OK provided they stuck to that rule.

Fast forward to maybe a couple of months later. After repeated warnings that they were not sticking to the plan and promises to do so, I found them playing CoC under the blankets at 11 PM, when they were supposed to be sleeping and had school the next day.

I did not react and gave no indication of having witnessed that.

A couple of days later I called each of them to their room and asked them to place their top ten favorite toys on the floor.

I then produced a pair of huge garbage bags and we put the toys in them, one bag for each of the kids.

I also asked for their iPods.

No anger, no scolding, just a conversation at a normal tone.

I asked them to grab the bags and follow me.

We went outside, I opened the garbage bin and told them to throw away their toys. It got emotional very quickly. I also gave them the iPods and told them to toss them into the bin.

After the crying subsided I explained that trust is one of the most delicate things in the world and that this was a consequence of them attempting to deceive us by secretly playing CoC when they knew the rules. This was followed by daily talks around the dinner table to explain just how harmful and addictive this stuff could be, how it made them behave and how important it was to honor promises.

Another week later I asked them to come into the garage with me and showed them that I had rescued their favorite toys from the garbage bin. The iPods were gone forever. And now there was a new rule: They could earn one toy per month by bringing top grades from school, helping around the house, keeping their rooms clean and organized and, in general, being well behaved.

That was followed by ten months of absolutely perfect kids learning about earning something they cherished every month. Of course, the behavior and dedication to their school work persisted well beyond having earned their last toy. Lots of talks, going out to do things and positive feedback of course.

They never got the iPods back. They never got social media accounts. They did not get smart phones until much older.

To this day, now well into university, they thank me for having taken away their iPods.

So, again, I don't know about parents in the aggregate, but I don't think being a good parent is difficult.

You are not there to be an all-enabling friend, you are there to guide a new human through life and into adulthood. You are there to teach them everything and, as I still tell them all the time, aim for them to be better than you.

https://www.youtube.com/watch?v=99j0zLuNhi8


This reads like something I'd find on /r/LinkedInLunatics, all the way down to the one-sentence/thought-per-line formatting.

My parents took the same approach and it helped, but I will anecdotally point out that kids have played video games under covers for a long time, even when I was young. I remember getting in trouble for playing a Spyro Game & Watch clone from McDonald's at night, or a Game Boy with one of those lamps that plugged into the serial port. When I become a parent, I think I'd be understanding of something like this, but would likely still only give them access to hardware like cell-enabled Apple Watches or DSes. The issue I take with modern games like CoC is that they are psychologically engineered to be mentally harmful, and push you to spend real money on fake things. I've seen many peers who were engaged in CoC as kids get into online gambling and sports gambling recently; it doesn't sit right.

> The issue I take with modern games like CoC is that they are psychologically engineered to be mentally harmful

Precisely. I am not saying I am perfect as a parent or that this was the best possible approach to the situation we had. Nobody is, and perfect parenting is an absolute myth.

I knew full well just how addictive gaming could be because I experienced it in my 20s. Needless to say, the "shock and awe" consequence to their deceit was not the result of a single data point. We had been seeing changes in behavior over time (six months or so). The objective was threefold: take away the device that delivered the addictive behavior, take away something of value to them, and make them earn it back with positive behavior.

The decision was not planned and the consequences were not communicated in advance. Few things in life are like that. Sometimes people discover the consequences of their actions (or understand them) when they are sprung on them because of something they did. Drunk driving being one possible (though not perfect) example of this.

In this case, it worked. Perhaps we got lucky. Not sure. I also did highlight that I cannot speak for all parents. I did the best I thought made sense at the time. Based on the outcome, many years later, I can say it worked.

To the critics on this thread: Your mileage may vary. Some of the comments sound juvenile; perhaps you'll understand if you ever become a parent and face similar circumstances. Then see what you think of someone who claims to know better from behind a keyboard than you did in the moment, without having to be responsible for the outcomes (which is a multi-year commitment).


You probably figured, but I am likely the same age as your kids. I agree that the similar "shock and awe" nature with which my parents treated this stuff was warranted, and in fact I wish they had gone a little further, but even hiding the batteries to all devices and only allowing them out for a couple hours a day was progress. The problem I see coming my way is that the cultural monolith has degraded to the point where an online kid and an offline kid can't coexist; it was already pretty strained when I was a high school student in the '10s. Isolation isn't the answer, and in my own experience, while one can tolerate being "weird", the lack of a shared culture is often dislocating. At this point I'm just hoping there's somewhere I could find with like-minded parents.

The issue with any parent's narrative, including yours, is that it's one-sided. We'd need the story told by the children-turned-adults to make any fair judgement. Some people are going to say what their family wants them to hear and only open up to professionals or a neutral third party.

> We'd need the story told by the children-turned-adults to make any fair judgement.

True enough. Of course, you are not going to get that in this case. All I can say is that those commenting here about potentially cataclysmic consequences are likely precisely the kind of people who will practice the kind of soft "friend class" parenting that can result in really troubled kids. If they even have kids at all, because some of the comments by others sound infantile.

The other narrative that is utterly false is that of role models in the negative sense. Almost all of you are one or two generations away from a culture and style of parenting where beating the kids was considered normal and even good parenting. An era where teachers beating kids in school was also normal and accepted. And yet, that has largely not survived the generational divide except in some segments of some cultures.

Raising kids and being a role model isn't a matter of single events or experiences, it is, like most other things in the human condition, a matter of building a relationship over time and understanding that life usually is a rollercoaster ride, not a straight-and-flat road.


Thanks for responding, and I don't disagree.

> I explained that trust is one of the most delicate things in the world

> lies to own children about throwing their toys away


I can't tell whether "destroying all your favorite toys" was a clear expectation the kids already had as a possible outcome of their choices. There seem to be two approaches here:

1. Teach children about consequences... by using clear expectations, timely feedback, and proportional responses.

2. Teach children about consequences... by allowing wrongdoing to become a festering mess until it "justifies" some big punishment that comes as deliberate emotional trauma and surprise.

Separately from asking which one is more "effective" at conditioning an immediate behavior, each choice also affects how those kids are going to behave when they are in any position to set and enforce rules. Being a role-model is hard.


It's like the food industry blaming parents. Like sugar, apps and games are designed to be addictive to the point that they act like a drug. Stop the drug dealer, not the consumer.

Blaming parents is a bit unwarranted when, on the other end, we have business interests driven by the perverse incentive of preying on children's gullibility for their own profit.

When you say "We'll try everything," that is simply not true; in particular, what we do not try is strict consumer protection laws that prohibit targeting children. Europe used to have such laws in the 1980s and 1990s, but by the mid-1990s authorities had all but stopped enforcing them.

We have tried consumer protection, and we know it works, but we are not trying it now. And I think there is exactly one reason for that, the tech lobby has an outsized influence on western legislators and regulators, and the tech industry does not want to be regulated.


It is literally the parents' responsibility. You want to blame someone else. Raising a kid doesn't mean letting society raise them; you have to make tough choices.

If parents can't handle that, they can give them up to the state.


Isn't it literally a platform's responsibility to make sure it is being used responsibly, as well?

Imagine a gun range that was well aware that their grounds were being used in nefarious ways. We'd shut it down. A hospital that just blindly gave out pain killers to anyone that asked. We'd shut it down.

Does this mean that a zero tolerance policy is what should be used to shut things down? I don't think so. We have some agency to control things, though.


I am not gonna blame parents while businesses are allowed to target children with ads for the newest mobile game. Children are very easy to influence, and this is exploited heavily by the tech industry, which showers children with advertising. This is predatory behavior, which the legislators and regulators of western governments (including in Europe) have allowed to proliferate.

We cannot expect every parent to be able to protect their children when they are being preyed on by dozens of multi-million-dollar companies, and the state is on the side of the companies.


> I am not gonna blame parents while businesses are allowed to target children with ads about the newest mobile game.

Those kids shouldn't even have a mobile device to play said game. That's where the parents can, and should, make a difference: don't let your kid even have a smartphone in the first place.


Kids also tend to disobey, and whine about it. Sure you can say parents should be strict and thorough, but you can’t expect 100% of parents (who are often tired from a hard day at work) to be 100% diligent 100% of the time.

And the reason we have these ads is that corporations are hoping that the kids will indeed disobey, and whine constantly at their parents, until they have their way (as directed to by the targeted ad). There was a good reason why targeting kids in ads used to be illegal in Europe.


> "to target children with ads about the newest mobile game"

They aren't. The target for those games is middle-aged, "middle class" women, especially childless women. You just don't realize that the loud sounds and bright colors appeal to a demographic other than children. Those games are usually terrible for children (as in, the children don't like them), because they are usually pay-to-win games and adults can just out-spend them (and the adults are often terrible winners).


>Children are very easy to influence, and this is exploited heavily by the tech industry, who shower children with advertising

The parents' job is to say no. If they're letting themselves be influenced too, that's bad parenting.


Are you a parent? This isn’t bait or some lame attempt at “as a parent…” but it is important for how I construct my response.

And it is the job of the legislator to tell conflicting interests no. If they are influenced by lobby groups, that’s a bad government.

> We'll try everything, it seems, other than holding parents accountable for what their children consume.

You've missed the point. No legislator or politician cares about what the parents are doing.

What they care about is gaining greater control of people's data to then coerce them endlessly (with the assistance of technology) into acting as they would like. To do that, they need all that info.

"The children" is the sugar on the pill of de-anonymised internet.


Ah, the abstinence theory of protection. How it continues to rear its ugly head.

Why this utter drivel is the top comment is beyond me, unbelievable.


That is not what the post you are replying to is advocating for at all; try reading it one more time with less hostility.

Can you offer some rebuttal to give some credence to your point?

A physical realm that is safe for children to explore on their own is clearly preferable to one where it's transgressive to let a child go outside without an escort.

It is plausible that the same applies to the digital realm.


I was randomly forced to do this about a year ago. I gave them everything except a passport (I tried providing other documents, but support is either bots or overseas), got rejected, and lost a 15-year-old legitimate business account.

I could never find any explanation for why I was targeted by this; it said it detected "suspicious activity," but I only ever interacted with recruiters, and only occasionally. Supposedly the data is deleted afterward if you don't go all the way through, but I do not believe it. This data ends up in very weird places and they can go fuck themselves for it, afaic.


He has testified to Congress that IG/Meta does not promote sexual content, which is nuts, because anyone who's spent 5 minutes on the platform knows this is absolutely not the case.

In my experience it’s mostly sexually adjacent content with just enough plausible deniability that you could say it’s a comedic sketch or something. They’re not funny, and the punchline is usually tits, but it has the cosmetic structure of a joke.

I think it's just by nature very engaging, as dudes will go look at other posts and comment about their looks (at least on the older ones), etc.

Both can be true. IG/Meta does not promote sexual content. Users promote sexual content. That might be subtle but there is a real distinction.

> That might be subtle but there is a real distinction.

A distinction without a difference, as the expression goes


And who controls what user content goes into user feeds?

Sometimes (or often) things with horrible security flaws "work" but not in the way that they should and are exposing you to risk.

If you refuse to run AI generated code for this reason, then you should refuse to run closed source code for the same reason.

I don't see how the two correlate; commercial, closed source software usually has teams of professionals behind it with a vested and shared interest in not shipping crap that will blow up in their customers' faces. I don't think the motivations of "guy who vibe coded a shitty app in an afternoon" are the same.

And to answer you more directly: generally, in my professional world, I don't use closed source software often, for security reasons, and when I do, it's from major players with oodles more resources and capital expenditure than "some guy with a credit card who paid for a Gemini subscription."


> The cool part about pre-AI show HN is you got to talk to someone who had thought about a problem for way longer than you had

Honestly, I agree, but the rash of "check out my vibe coded solution for a perceived $problem I have no expertise in whatsoever and built in an afternoon" posts, and the flurry of domain experts responding with "wtf, no one needs this," is kind of schadenfreude, but I feel a little guilty for enjoying it.


From what I can tell, domain experts mostly don't directly respond like that. They just make separate meta-level commentaries about Show HN getting flooded. Most submissions get little or no response.

I agree with this, and personally I don't even go to the comment section of those posts. What's the point? There is nothing to learn and no one willing to learn anything.

>and the flurry of domain experts responding like "wtf, no one needs this"

People have been saying this about Show HNs since time immemorial. There has been an insane number of poorly thought out, poorly considered, often get-rich-quick creations long before AI: things where the submitter clearly doesn't understand the industry they're targeting, doesn't provide any sort of solution, etc. It's really strange if people actually think this is a new phenomenon.

Indeed, a recent video that I rather loved touches on this - https://www.youtube.com/watch?v=Km2bn0HvUwg

Its subject is "Everything was Already AI", the point being that everyone is quantizing and simplifying and reflecting everyone else and the consensus, in such a fashion that people acting like AI ruined everything...yeah, it was already ruined. We already have furry artists drawing furry art just like countless other furry artists, declaring it an outrage that someone used AI to draw furry art, and so on. As the video covers, the whole idea of genres is basically people just cloning each other.

Be right back, going to put on a cowboy hat and denim and sing in a drawl about pickups and exes.


While I agree with your overall point, I think art is quite different.

I don't personally consume furry art but I am a fan of Studio Ghibli and the anime medium in general. And even within that medium, certain artists have a very different style than others. I can usually tell Makoto Shinkai's style vs Hayao Miyazaki's style vs Akira Toriyama's style. I don't think any of them ever claimed to have copied each other. But they have all worked thousands of hours to perfect their craft.

With AI, you get people like me, who can't draw stick figures, tell Chatgpt or nano banana to make an anime version of themselves and then voilà! You get something that could probably pass as Miyazaki's in a minute.

No artist has a claim or monopoly on a genre, but they do have a claim on their own art style. With AI being trained on artists' styles, the artists whose works literally trained the AIs are now being inundated with low effort copycats of their creations.

That being said, I wrote in another thread comment that AI is an accelerator of what already exists. In a codebase, if you have crappy code patterns, AI will just accelerate that.

In business, like you said, people who had crappy ideas have always been able to submit crappy business ideas. Only a few of them actually tried to execute on them. With AI, more of them can execute on them.

I think this "boringness" the article is talking about always existed. It just becomes more prevelant because AI reduces the barrier to entry.


On the whole you're right, but it's also the case that scale matters. Show HNs have always been mostly bullshit, but producing a bullshit Show HN was on the same order of magnitude difficulty of producing a good one. If LLMs were to provide 10x productivity, we'd have the same number of good Show HNs, and 10x more bullshit ones.

> schadenfreude

I’ve been partaking in my fair share, but more and more I’m just feeling sad for my fellow coders ‘cause a lot of what I’m hearing is about bad local choices and burdensome tech stacks.

Sure, it’s kinda hilarious watching a bunch of fashion obsessed front-end devs discover bash, TDD, and that, like, specifications, like, can really be useful, you know, for building stuff or whatever.

But then I think about a version of me who came up a bit later, bit into some reasonable sounding orthodoxy about React or Node as my first production language and who would be having the same ‘profound’ revelations. I never would have learned better. I wouldn’t be as empowered from having these system programming concepts hammered into me. LLMs would be more ‘magic’, I’d extrapolate more readily…

I’ve found myself thinking a lot of thoughts tantamount to “why don’t you dummies just use Haskell, or Lisp, or OCaml, or F#, or Kotlin for that?!”, and from their PoV I’m seeing a broken ladder. A ladder that was orthodoxy and well-documented when I was coming up.

LLMs should ideally bring SICP and Knuth and emacs to the masses. Fingers crossed.


Don't you think there is an opposite effect too?

I feel like I can breeze past the easy, time-consuming infrastructure phase of projects and spend MUCH more time getting to high-level, interesting problems.


I am saying that a lot of the time these types of posts address a nonexistent problem, a problem that is already solved, or a "problem" that isn't really a problem at all and results from a lack of understanding.

The most recent one I remember commenting on, the poor guy had a project that basically tried to "skip" IaC tools, and his tool basically went nuts in the console (or API, I don't remember) in one account, then exported it all to another account for reasons that didn't make any sense at all. These are already solved problems (in multiple ways) and it seemed like the person just didn't realize terraformer was already an existing, proven tool.

I am not trying to say these things don't allow you to prototype quickly or get tedious, easy stuff out of the way. I'm saying that if you try to solve a problem in a domain that you have no expertise in with these tools and show other experts your work, they may chuckle at what you tried to do because it sometimes does look very silly.


I'm building an education platform. 95% is vibe coded. What isn't vibe coded though is the content. AI is really uninspiring with how to teach technical subjects. Also, the full UX? I do that. Marketing plan? 90% is me.

But AI does the code. Well... usually.

People call my project creative. Some are actually using it.

I feel many technical things aren't really technical things; they are simply problems where "have a web app" is part of the solution, but the real part of the solution is in the content and the interaction design, not in how you solved the challenge technically.


> or just thinking about a "problem" that isn't really a problem at all and results from a lack of understanding

You might be on to something. Maybe it's self-selection (as in, people who want to engage deeply with a certain topic but lack domain expertise might be more likely to go for "vibecodable" solutions).


I compare it to a project I worked on when I was very junior, a long time ago: I built by hand a complicated harness of scripts to deploy VMs on bare metal and do things like create customizable, on-the-fly test environments for the devs on my team. It worked fine, but it was a massive time sink, lots of code, was extremely difficult to maintain, and could quite often have weird behavior or bad assumptions.

I made it because at that point in my career I simply didn't know that ansible existed, or cloud solutions that were very cheap to do the same thing. I spent a crazy amount of effort doing something that ansible probably could have done for me in an afternoon. That's what sometimes these projects feel like to me. It's kind of like a solution looking for a problem a lot of the time.

I just scanned through the front page of the show HN page and quickly eyeballed several of these type of things.


Yeah, the feeling that hits when you finally realize you spent THIS MUCH EFFORT on a problem, and then realize you could have done more with less.

> I made it because at that point in my career I simply didn't know that ansible existed

Channels Mark Twain: "Sorry for such a long letter, I didn't have the time to make it shorter."


This is why I make it a goal to have very good knowledge of the tools I use. So many problems can be solved by piping a few unix tools together, or have a whole chapter in the docs (emacs, vim, postgres, …) about how to solve them.

I write software when the scripts are no longer suitable.
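To make the "pipe a few unix tools together" point concrete, here is a hedged sketch of the kind of throwaway pipeline meant here: a word-frequency count built entirely from standard tools (the inline input string is just an illustration; in practice you'd feed it a real file).

```shell
# Rank the most frequent words in a stream: split on non-letters,
# sort so duplicates are adjacent, count them, then rank by count.
printf 'a b a c a b\n' \
  | tr -cs '[:alpha:]' '\n' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -3
# prints (with leading whitespace from uniq -c): 3 a, 2 b, 1 c
```

Each stage does one small job, which is exactly why no bespoke program is needed until the requirements outgrow what a pipeline can express.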


I've read opinions in the same vein of what you said, except painting this as a good outcome. The gist of the argument is why spend time looking for the right tool and effort learning its uses when you can tell an agent to work out the "problem" for you and spit out a tailored solution.

It's about being oblivious, I suppose. Not too different to claiming there will be no need to write new fiction when an LLM will write the work you want to read by request.


It's a reasonable question. Having shipped some of these naive solutions before, I would probably answer that you'll find out later that it doesn't do entirely what you wished, is very difficult or impossible to maintain, has severe flaws you're unable to be aware of because you lack the domain expertise, or, worst of all in my opinion, becomes completely unable to adapt to new features you need, whereas the more mature solutions have most likely already spent a considerable amount of time thinking about these things.

I was dabbling in consulting on infrastructure for a bit, and prospects would often come to me with stuff like "well, I'll just have AI do it." My response has been "OK, do that, but do keep me in mind if that becomes very difficult a year or two down the road." I haven't yet followed up with any of them to see how they are doing, but some of the ideas I heard were just absolute insanity to me.


We need a Stack Overflow-style "dupe" mechanism, something meme-worthy.

I do believe you, but I have to ask: what are these incredibly tedious "easy, time consuming parts of projects" everyone seems to bring up? Refactoring I can see, but I have a sense that's not what you mean here.

That's actually a great point. I feel like unless you know for sure that you will never need something again, nothing is disposable. I find myself diving into places I thought I would never care about again ALL the time.

Every single time I have vibe coded a project I cared about, letting the AI rip with mild code review and rigorous testing has bitten me in the ass, without fail. It doesn't extend the project in the direction I want, things clearly spiral out of control, etc. Just satisfying some specs at the time of creation isn't enough. These things evolve; they're a living being.


In a simple text based game I'm vibe coding for fun, I created skills that help the specs evolve.

I started with chatgpt, I told it to make me a road map of game features.

Then I use that road map to guide my LLM (I use codex 5.3), with the specification — when working on tasks, if you learn anything that may be out of scope, add it to the road map.

There's a bit more to it than that, but so far I've got a playable game, and at some point the requirement of adding an admin dashboard for experiments got added to the road map, and that got implemented pretty well too.

At first I did review a lot of its code, but now I just let it rip and I've been happy with it thus far.

At work I use AI heavily but obviously since I'm responsible for whatever code I push I do actually review and test and understand, but mostly I just need to tweak some small things before it's good enough to ship.


For me, the answer to this question is: parts that involve no architectural decisions, and that won't need to be extended or built upon significantly in the future.

When I'm working on a greenfield project that I intend to build out further (which is what I am currently doing), I find that there's not a lot of work that fits those criteria. I expect that can change drastically when you're working on something that is either more mature, or more narrowly scoped (and thus won't need to be extended too much, meaning poor architectural decisions are not a big issue).

