
I've said this before, and I stand by it. I think AI does pose a threat, but not the existential one that dominates popular discussion.

Over the next few decades AI is going to take huge numbers of jobs away from humans.

It doesn't need to fully automate a particular role to take jobs away; it just needs to make a human significantly more productive, to the point that one human plus AI can replace n>1 humans. This is already happening. 20 years ago a supermarket needed 20 cashiers to run 20 tills. Now it needs 2 to oversee 20 self-checkouts and maybe 1 or 2 extra for a few regular lanes.

This extra productivity of a single human is not translating into higher wages or more time off; it's translating into more profits for the companies augmenting humans with AI.

We need to start transitioning to an economic model where humans can work less (because AI supplements their productivity) and individual humans reap the benefits of all this increased AI capability. Otherwise we're going to sleepwalk into a world where the majority have been replaced and have no function in society, while the minority of capital owners control the AI, the money and the power.

I wish we could focus on these nearer-term problems that have already started, instead of the far more distant existential threat of a human/AI war.



I kinda agree, but I think this itself represents an existential threat. Governments have a history of hard-line policies and intolerance when their populations face low wages and declining standards of living. This leads to isolationism, exceptionalism, and FUD. That's how wars start. We now have global communication at light-speed, anyone can say anything to anyone else. Dropping generative AI into that is like pouring jet fuel on a bonfire.



