Enough time for LLMs to see marked improvement, and possibly for the hallucination issue to be significantly reduced or solved.
When that happens, the remaining software engineers won’t be skilled. As with many people in principal, staff, or managerial positions… technical skills won’t matter. AI now handles the technical skills, and the people controlling the AI are the ones who can navigate the politics.
If AI handles all the technical skills, what’s left?
Like you made a claim here with no logical reasoning as to why.
My reasoning is simple: if AI handles all technical skills, then no technical skills are required for the job anymore. Where’s your reasoning?
And speaking of “unqualified hand waving”: look at your own baseless statement. At least I qualified mine with “it’s possible”.
If LLMs solve the hallucination issue, then it’s people like this who will sit at the top of the hierarchy: people who make grand claims with confidence and play politics to get to the top. You say my statement is handwavy, but really that’s another way of saying “I don’t need to prove you wrong; I’m just going to make a baseless claim that your statement is utter crap, but do it in a way that subverts HN politeness rules. I can phrase it so it doesn’t raise any eyebrows, and people who already agree with me will automatically take my side even though I didn’t present any new reasoning.”
This is what I’m talking about. People who are good at strategies like this will occupy the top spots in the future, should LLMs continue to improve. Case in point: if karma were the vote for the next leadership position, you would win.
For now, the hallucination problem is objectively only getting worse with reasoning models, and there is no solution in sight.
Technical skills will matter for the simple reason that someone will always need to supervise the output of the AI. A non-technical person can't hope to do that well. Without this supervision, the AI's work product will have skeletons that cause unexpected issues with outliers. Reality is full of outliers - handling them is what pays half the salary.
Imagine asking an AI to design an airplane. The design passes all test flights and also software tests. Would you want to fly in it without an expert human having reviewed the ins and outs of the design?
How about a CT scan machine? Would you want to risk the 10x radiation due to a hypothetical implementation error that strikes in 1 out of 10K cases?
> For now, the hallucination problem is objectively only getting worse with reasoning models, and there is no solution in sight.
You pulled this statement out of your ass. “Objectively”? We have baseline quantitative tests that say the opposite. LLMs are doing better on those tests, and they have been improving. Where did your “objective” statement come from? Anecdotes? Quotations?
> Technical skills will matter for the simple reason that someone will always need to supervise the output of the AI.
Humans will never be so stupid as to lose all technical skill. In the beginning, only mild technical skill is needed, at most, so the human can somewhat understand what the AI is doing. But as trust grows, the human will understand less and less of it. The human is like a desperate micromanager clinging to understanding, but that understanding inevitably erodes completely.
For commentary on the hallucination problem, please see https://news.ycombinator.com/item?id=43942800 and the associated news link. I have observed it myself in practice with o3 compared to o1.
> If AI handles all the technical skills what’s left?
This is a view utterly out of touch with reality. AI handling all technical skills? And who defines where technical skills start and end?
Will AI development/innovation stop in the future? And if not, will the engineers working on AI not be applying technical skills? Will AI eliminate the technical skill of plumbing? Of doctors? What about systems analysis and architectural design, which are also technical skills? Will AI read the minds of people, anticipate their every tech need, and eliminate that too?
Maybe some have a weird understanding of "technical". Its meaning is much broader than they think.
Again, how is it out of touch with reality? We have memes about vibe coders who don’t understand shit. Why do those memes exist? Because that’s where the trend lines of technology are pointing. Whether we get there is another story, but denying the trend line is irrational.
Oh, I just read your second paragraph. I’m obviously only talking about coding. Blue-collar jobs that involve a lot of manual skill are less of an issue. Robotics has an upward trend line, but it’s nowhere near moving at the breakneck pace of AI.