
> ... a smokescreen to distract people from thinking clearly about the more pedestrian hazards of AI that isn't self-improving or superhuman,

Anything that can't be self-improving or superhuman almost certainly isn't worthy of the moniker "AI". A true AI will be born into a world that has already unlocked the principles of intelligence. Humans in that world would be capable themselves of improving AI (slowly), but the AI itself will (presumably) run on silicon and be a quick thinker. It will be able to self-improve, rapidly at first, and then more rapidly as its increased intelligence allows for even quicker rates of improvement. And if not superhuman initially, it would soon become so.

We don't even have anything resembling real AI at the moment. Generative models are probably a blind alley.

> We don't even have anything resembling real AI at the moment. Generative models are probably some blind alley.

I think that the OP's point was that it doesn't matter whether it's "real AI" or not. Even if it's just a glorified auto-correct system, it's one that has the clear potential to overturn our information/communication systems and our assumptions about individuals' economic value.


If that has the potential to ruin economies, then the economic rot is so much more profound than anyone (me included) ever realized.

I think when the GP says "our assumptions about individuals' economic value," they mean half the workforce becoming unemployed because the auto-corrector can do the work cheaper.

That's going to be a swift kick to your economy, no matter how strong.


If the "autocorrector" can do anything that used to be a paid wage/salary "cheaper", this isn't the case of another job being automated away. Either something is grossly wrong in that some are willing to have a stupid machine do that poorly (writing Sports Illustrated articles), or something was always grossly wrong in that we were paying someone to do something that never did matter (paying someone to write SEO spam). This isn't a swift kick to the economy; this is "your economy was so weak a soft breeze knocked it over".

> Either something is grossly wrong in that some are willing to have a stupid machine do that poorly (writing Sports Illustrated articles), or something was always grossly wrong in that we were paying someone to do something that never did matter

I recently had an LLM write a function for me that, given an RGB color value and an integer n > 1, returns a list of n RGB colors equidistantly and sequentially spaced around the color wheel, starting at the specified color.
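For the curious, a minimal sketch of what such a function might look like in Python, using the standard-library colorsys module. The approach (rotate the hue in HSV space, keep saturation and value fixed) and the function name are my own assumptions, not the commenter's actual code:

```python
import colorsys

def color_wheel_colors(rgb, n):
    """Given an (r, g, b) tuple of 0-255 ints and an integer n > 1,
    return n RGB colors equidistantly spaced around the color wheel,
    starting at the given color."""
    if n <= 1:
        raise ValueError("n must be greater than 1")
    # Normalize to [0, 1] and convert to HSV so we can rotate the hue.
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    colors = []
    for i in range(n):
        hue = (h + i / n) % 1.0  # step the hue by 1/n of a full turn
        ri, gi, bi = colorsys.hsv_to_rgb(hue, s, v)
        colors.append((round(ri * 255), round(gi * 255), round(bi * 255)))
    return colors
```

For example, starting from pure red with n = 3, the hue steps of 120° land on pure green and pure blue.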

For a given system I'm creating, I might have lots of such tasks. That collection of tasks is something that did matter and took some education and skill to complete well.

In the pre-LLM world - assuming I was too busy to handle all the tasks myself - I would have delegated them to a junior software engineer.

In a post-LLM world, I just ask the LLM to implement tasks like that, and I review the code for correctness.

That seems like a pretty transformational change to me, and not just some kind of "rot" being removed from the process.



