
Reminds me of criticisms of Python decades ago: that you wouldn't understand what the "real code" was doing since you were using a scripting language. But over the years it showed tremendous value, and many unicorns were built by focusing on higher-level details rather than lower-level code.




Comparing LLMs to programming languages is a false equivalence. I don't have to write assembly because LLVM will do that for me correctly 100% of the time, while AI might or might not (especially the further I move away from template CRUD apps).

It might be functionally correct, but if you wrote it yourself it could be orders of magnitude faster.

We've been in the Electron era long enough to know that developer time is more expensive than CPU time.

That is a myth. CPU time is time spent waiting around by your users while the CPU takes seconds to do something that could be instant. If you have millions of users and that happens every day, it quickly adds up to many years' worth of time.
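To put rough numbers on that (a back-of-the-envelope sketch; the user count and wasted seconds are purely illustrative assumptions):

  # Hypothetical figures, only to show the scale of the claim above.
  users = 1_000_000         # daily active users (assumed)
  wasted_seconds = 2        # seconds each user waits per day (assumed)
  days = 365

  person_seconds = users * wasted_seconds * days
  person_years = person_seconds / (60 * 60 * 24 * 365)
  print(f"{person_years:.1f} person-years of waiting per year")  # ~23.1

Even at these modest numbers, a couple of wasted seconds per user per day works out to decades of aggregate human time every year.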

It might be true if you just look at development cost, but not if you look at value as a whole. And even on development cost alone it's often not true, since time the developer spends waiting for tests to run and things to start also slows things down; taking a bit of time there to reduce CPU time is well worth it just to get things done faster.


Yeah, it's time spent by the users. Maybe it's an inefficiency of the market because the software company doesn't feel the negative effect enough, maybe it really is cheaper in aggregate than doing 3 different native apps in C++. But if CPU time is so valuable, why aren't we arguing for hand written C or even assembly code instead of the layers upon layers of abstraction in even native modern software?

Also, why


> But if CPU time is so valuable, why aren't we arguing for hand written C or even assembly code instead of the layers upon layers of abstraction

Maybe we should. All it took was Figma taking it seriously and working at a lower level to make every other competitor feel awful and clunky next to it, and it went on to dominate the market.


> But if CPU time is so valuable, why aren't we arguing for hand written C or even assembly code instead of the layers upon layers of abstraction in even native modern software?

Many of us do frequently argue for something similar. Take a look at Casey Muratori's Performance-Aware Programming series if you care about the arguments.


> But if CPU time is so valuable, why aren't we arguing for hand written C or even assembly code instead of the layers upon layers of abstraction in even native modern software?

That is an extreme case though. I didn't mean that all optimizations are always worth it, but if we look at the marginal value gained from optimizations today, the payback is usually massive.

It isn't done enough since managers tend to undervalue user and developer time. But users don't undervalue user time: if your program wastes their time, many users will stop using it. Users are pretty rational about that aspect and prefer faster products or sites unless they are very lacking. If a website is slow a few times in a row, I start looking for alternatives, and data says most users do the same.

I even stopped my JetBrains subscription because the editor got so much slower in an update; now I just use the version I can keep forever, as I don't want their patched editor. If it hadn't gotten slower I'd gladly have kept it, since I liked some of the new features, but the slowdown alone was enough to make me go back.

Also, while managers can obviously agree that making developers spend less time waiting is a good thing, it is very rare for managers to tell you to optimize compilation times or the like, even though pretty simple optimizations there can often make that part of the work massively faster. For example, if you profile your C++ build, look at which files the compiler spends the most time on, and then dig into those files to figure out why they are so slow, you can find weird things whose fix speeds compilation up 10x, so what took 30 seconds now takes 3. That is obviously very helpful, and if you are used to that sort of thing you can do it in a couple of hours.
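A minimal sketch of that kind of measurement (assuming a plain clang++/g++ setup and a flat list of .cpp files; a real project would need its actual include paths and flags, and clang's -ftime-trace can break the time down further within a file):

  # Rough per-file compile timing; compiler, flags and paths are placeholders.
  import glob, subprocess, time

  results = []
  for src in glob.glob("src/**/*.cpp", recursive=True):
      start = time.monotonic()
      subprocess.run(["c++", "-std=c++17", "-c", src, "-o", "/dev/null"],
                     check=True)
      results.append((time.monotonic() - start, src))

  # Worst offenders first: these are the files worth opening to see which
  # headers or templates make them so slow to build.
  for seconds, src in sorted(results, reverse=True)[:10]:
      print(f"{seconds:6.2f}s  {src}")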


No, I wouldn't. That would require me to be proficient at this, and I am not, so I am pretty sure I would not end up writing better assembly optimisations unless I actually became better at it.

The difference is that there is no point (that I know of or would encounter) at which a compiler would not actually be able to do the job and I would need to write manual assembly to fix parts the compiler could not handle. Yes, a proficient programmer could probably do that to optimise the code, but the code would run and do the job regardless. That is not the case for LLMs: there is a non-zero chance you get to the point where LLM agents are stuck and it makes more sense to get your hands dirty than to keep iterating with agents.


That's not the same thing. LLMs don't just obscure low-level technical implementation details the way Python does; they also obscure your business logic and many of its edge cases.

Letting a Python interpreter manage your memory is one thing because it's usually irrelevant, but you can't say the same thing about business logic. Encoding those precise rules and considering all of the gnarly real-world edge cases is what defines your software.

There are no "higher level details" in software development, those are in the domain of different jobs like project managers or analysts. Once AI can reliably translate fuzzy natural language into precise and accurate code, software development will simply die as a profession. Our jobs won't morph into something different - this is our job.


>There are no "higher level details" in software development, those are in the domain of different jobs like project managers or analysts. Once AI can reliably translate fuzzy natural language into precise and accurate code, software development will simply die as a profession. Our jobs won't morph into something different - this is our job.

I'm the non-software type of Engineer. I've always kind of viewed code as a way to bridge mathematics and control logic.

When I was at university I was required to take a first-year course called "Introduction to Programming and Algorithms". It essentially taught us how to think about problem solving from a computer programming perspective. One example I still remember from the course was learning how you can use a computer to solve something like Newton's Method.
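For reference, that kind of exercise looks roughly like this (a minimal sketch in Python; the function and starting guess are just an example):

  # Newton's method: repeatedly follow the tangent line,
  # x_next = x - f(x) / f'(x), until the update is tiny.
  def newton(f, df, x, tol=1e-10, max_iter=50):
      for _ in range(max_iter):
          step = f(x) / df(x)
          x -= step
          if abs(step) < tol:
              break
      return x

  # Example: solve x**2 - 2 = 0, i.e. approximate sqrt(2).
  print(newton(lambda x: x * x - 2, lambda x: 2 * x, x=1.0))  # ~1.4142135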

I don't really hear a lot of software people talk about algorithms, but for me that is where the real power of programming lives. I can see some idealized future where you write programs just by mixing and matching algorithms, and almost every problem becomes essentially a state machine. To move from state A to state B I apply these transformations, which map to these well-known algorithms. I could see an AI being capable of that sort of pattern matching.
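As a toy sketch of that "state A to state B via known transformations" idea (the states and steps here are made up purely for illustration):

  # The program is just a declared pipeline of well-known steps.
  PIPELINE = {
      "raw":    ("sorted", sorted),
      "sorted": ("unique", lambda xs: list(dict.fromkeys(xs))),  # dedupe
      "unique": ("summed", sum),
  }

  def run(state, value, goal):
      while state != goal:
          state, transform = PIPELINE[state]
          value = transform(value)
      return value

  print(run("raw", [3, 1, 2, 3, 1], goal="summed"))  # 6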


The hard thing is to define what state A and state B mean. Also, to prepare for states C and D so that it doesn't cost more to add them to the mix. And to find that state E everyone is failing to mention...

> "Once AI can reliably translate fuzzy natural language into precise and accurate code, software development will simply die as a profession."

One-shotting anything like this is a non-starter for any remotely complex task. The reason is that fuzzy language is ambiguous and poorly defined. So even in this scenario you enter into a domain where it's going to require iterative cycling and refinement. And I'm not even considering the endless meta-factors that further complicate this, like performance considerations depending on how you plan to deploy.

And even if language were perfectly well defined, you'd end up with 'prompts' that would essentially be source code in their own right. I have a friend who is rather smart, but not a tech type, and he's currently developing a very simple project using LLMs. It's still a "real" project in that there are certain edge cases you need to consider, various cross-functionality in the UI that needs to be carried out, interactions with some underlying systems, and so on.

His 'prompt' is gradually turning into just a natural language program, of comparable length and complexity. And with the amount of credits he's churning through making it, in the end he may well have been much better off just hiring some programmers on one of those 'gig programming' sites.

------

And beyond all of this, even if you can surmount these issues - which I think may be inherently impossible - you have another one. The reason people hire software devs is not that they can't do it themselves, but that they want to devote their attention to other things. E.g., most people could do janitorial work, yet companies still hire millions of janitors. So the 'worst case' scenario would be that you dramatically lower the barriers to entry to software development, and wages plummet accordingly.


> Once AI can reliably translate fuzzy natural language into precise and accurate code, software development will simply die as a profession.

The best humans on the planet constantly fail at this, so the assumption is that some post-LLM AGI will manage it?


But working with AI isn't really a higher level of abstraction. It's a completely different process. I'm not hating on it, I love LLMs and use them constantly, but it doesn't go assembly > C > Python > LLMs.

It would be a higher level of abstraction if there were no need to handhold LLMs. You'd just let one agent build the UI and another the backend, just as you would with humans (you wouldn't validate their entire body of work, including their testing and documentation).

At that point yeah, a project manager would be able to build everything.

We are not there.



