> "Once AI can reliably translate fuzzy natural language into precise and accurate code, software development will simply die as a profession."
One-shotting anything like this is a non-starter for any remotely complex task. The reason is that fuzzy language is ambiguous and poorly defined, so even in this scenario you end up in territory that requires iterative cycling and refinement. And I'm not even considering the endless meta-factors that complicate things further, like performance requirements that depend on how you plan to deploy.
And even if language were perfectly well defined, you'd end up with 'prompts' that would essentially be source code in their own right. I have a friend who is rather smart, but not a tech type - and he's currently using LLMs to develop a very simple project. It's still a "real" project, though, in that there are edge cases to consider, various pieces of UI functionality that need to work together, interactions with some underlying systems, and so on.
His 'prompt' is gradually turning into a natural language program of comparable length and complexity. And given the amount of credits he's churning through building it, he may well end up having been better off just hiring a programmer on one of those 'gig programming' sites.
------
And beyond all of this, even if you can surmount these issues - which I suspect may be inherently impossible - you run into another one. The reason people hire software devs is not that they can't do it themselves, but that they want to devote their attention to other things. Most people could do janitorial work, for example, yet companies still hire millions of janitors. So the 'worst case' scenario is that you dramatically lower the barriers to entry to software development, and wages plummet accordingly.