Hacker News | muldvarp's comments

AI will fuck over anyone that works for a living. Only the people owning stuff for a living will profit off AI.

Trades aren't safe. Even if nobody figures out how to automate trades, the amount of people that will go into trades when white collar jobs are automated away will drive wages down.

AI will be the worst thing to happen to society in a very long time.


> AI will be the worst thing to happen to society in a very long time.

Maybe. I keep going back and forth.

On one hand I might lose my job. On the other hand everybody might lose their job.

AI is tricky. If we have a singularity event, maybe one or two companies might take all the jobs overnight. Fine. But economies are weird. Once those jobs are automated and nobody has a job, we probably won't even need the jobs that were replaced.

Like today. We have jobs because some other thing came along and “made something easy”. Think about how many jobs we have simply because we as humans write bad software. If this goes away it’s not even about automation taking jobs, it’s about simply not needing huge swaths of jobs at all.

So I think about this and ponder. If all jobs are basically worthless, then the "rich" that people like to complain about won't be rich. They won't have anything either. Simply put, nobody will have any money to buy things, and thus the "rich" won't have anybody buying things to keep them rich.

So I think more. It’s really not an advantage for the rich and powerful to basically destroy what makes them rich.

For people to be rich they have to have a bunch of people to extract small amounts of money from. A starving and angry population is not going to be a fun place to live for anybody.


I actually think the last point isn't exactly true. I mean certainly in aggregate, but not at the margins. If the new tech billionaire elite just want to get to space, then they just need enough people to mine some minerals and metal and build their spaceships. If you control all the AI bots, you can make money that way.

Kinda the point being, a small number of companies could control all the resources and a few of the people and be rich that way. Yes, you're right, maybe the Waltons and the other families who made their money on people having money won't go away, but in theory you could have a group of super rich people just giving money to each other to build spaceships and nuclear power plants and the like.


Yeah, and as a tradie, your services will be paid for by wealthy white collar workers, like a guy who just moved into a bigger house he paid for with his cushy IT job, and wants a top-of-the-line HVAC system installed.

If the guy isn't making good money, he won't be hiring you either.


I genuinely feel like I got bait-and-switched by computer science. If I could go back and study something different I would do it in a heartbeat.

Sadly, there's very little I can do now. I don't have the financial means to meaningfully change careers now. Pretty much the only thing I could do now that pays somewhat well and doesn't require me to go to university again is teaching. I think I will ride this one out and end it when it ends.


What if you go back and discover every path you could have taken is a bait-and-switch?

I did like the (short) LLM-free part of my career. The bait-and-switch refers specifically to the changes due to the introduction of LLMs. Any career where LLMs don't play a big role would not have been a bait-and-switch.

That said, I don't understand the point of "what if nothing ever works out for you?"-type questions. What do you expect me to answer here? That I'm secretly a wizard and with the flick of my magic wand I'll make something work out?


The majority of the questions I ask are delivered with the hope that the answer I get is beyond what I might expect. For me, that's sort of the point in asking - fun times exploring together.

I do think everything can be seen as bait and switch if you assume there is someone behind the wheel who knows where we are going and how to drive. If anything, I might have been suspecting we'd both arrive at that point together.

Again, was hoping to be surprised a bit. The wizard bit was kinda fun. Mild thanks, human. I'll just be over here beating this tech over the head for kicks. I wish you well!


For most jobs in any field, having a degree is more important than what the degree is in. University is not a jobs training program, it’s a way to build a foundation. Understanding how systems work together can be applied in many areas of business, not just coding.

> AI mostly doesn’t produce adequate quality or correctness for the type of code I enjoy writing.

This assumes that companies care about "code quality" and customers care about bugs.

> If you find coding boring, explore the frontiers. You will find a lot of coding wilderness where no AI has trod.

There are a lot of software engineers and not a lot of frontier.


LLMs destroying Wikipedia would be incredibly sad and is one of the things that makes me think that LLMs will have a strong negative impact on the lives of most people.

I have previously translated a very small handful of redlink articles into English from another language. Chasing down the sources in the other language and synthesising and cross-referencing with English sources is a fun challenge. To the best of my knowledge, I did an OK job.

While translation tools are a godsend for that, as well as life in general when dealing with a language I am not that good at, LLMs make me increasingly reluctant to do that much more because there is no way I could detect AI slop in a second language. For all I know I'd be translating junk into English and enabling translingual citogenesis.

Bad as the slopwave is for native speakers, it's absolutely brutal for non-native speakers when you can't pick up on the tells. Maybe the gap will narrow and narrow until the slop is stylistically imperceptible.


I think people in the CS bubble will be optimistic up until it directly affects them by destroying software engineering as a (good) career.

That's already happening for juniors, a combination of AI and less funding.

It's already happening for all the seniors who got hit by the layoffs ("because AI make code go brrrr") in these past two years, too.

> There is absolutely a market for social media that bans AI slop.

I fully agree, I just don't know how that could work.

I think GenAI will kill the internet as we know it. The smart thing is (and always has been) to be online less and build real connections to real people offline.


There’s an assumption on HN that everyone can identify AI slop because pretty much everyone here can. But my personal experience and what I think might be more in line with reality is that the majority of social media users can’t tell or don’t care.

It's the explicitly stated goal of several of the largest companies on the planet which put up a lot of money to try to reach that goal. And the progress over the past few years has been stunning.

It's a pretty good trick if you don't look too hard at the details. Including how many users have been active for a year vs the weekly counts.

I think there is only a very narrow band where LLMs are good enough at producing software that "hand-coding" is genuinely dead but at the same time bad enough that (expensive) humans still need to be paid to be in the loop.

I think in a few years, we will realize that LLMs have impacted our lives in a deeply negative way. The relatively small improvements LLMs bring to my life will be vastly outweighed by the negatives.

If LLM abilities stagnate around the current level it's not even out of the question that LLMs will negatively impact productivity simply because of all of the AI slop we'll have to deal with.


Hmmm. Interesting prediction. I think even on social media, the consensus is still shaky, and social media is an unalloyed bad IMHO. I think personal cars impacted our lives in a deeply negative way but most people disagree. There is really no consensus on LLMs right now, I think if they stagnate this is where the discourse will stagnate also.

More likely, like other tools, it will be possible to point to clear harms and clear benefits, and people will disagree about the overall impact.


Let's be clear about one thing, though. LLMs are simply tools, which can be used for good or for bad. It's specific people who do and promote the latter - top management of tech companies, for example. It's those people who are creating the negatives in my life and who tank quality for the sake of "productivity". Point the finger and say it out loud. They shouldn't be able to hide behind the tech.

Inference is not that expensive. I'd argue that most models are already useful enough that people will pay to run them.


At $20/month for Claude, I'm satisfied. I'll keep paying that for what I get from it, even if it never improves again.


Of course, but my point is that I don't think it's economically sustainable. If innovation/funding in AI stalls, those $20 will likely skyrocket fast.
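To make that concrete, here's a rough back-of-envelope in Python. Every number in it is an assumption I'm pulling out of thin air (blended cost per million tokens, how many tokens a heavy user burns per day), not anything the providers have published; the point is just that plausible inputs can put the raw inference cost of a heavy user well above $20/month.

```python
# Back-of-envelope: does $20/month cover a heavy user's inference cost?
# All inputs below are assumptions for illustration, not real provider numbers.
cost_per_m_tokens = 10.0    # assumed blended provider cost, $ per 1M tokens
tokens_per_day = 200_000    # assumed heavy-user daily token usage
days_per_month = 30

monthly_tokens = tokens_per_day * days_per_month          # 6,000,000 tokens
monthly_cost = monthly_tokens / 1_000_000 * cost_per_m_tokens

print(f"assumed monthly inference cost: ${monthly_cost:.2f}")  # $60.00
```

Under those made-up numbers a heavy user costs $60/month to serve against a $20 subscription, which is the sense in which the current price could be a loss leader that rises if funding dries up. Lighter users, cheaper models, or falling hardware costs could flip the sign just as easily.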

