I keep telling people that this is why AI will turn the industry on its head very soon. It doesn't need to be better than a good developer or designer. Because if you look at the landscape in tech, the vast majority of people simply are not that good. People who are actually objectively good are few and far between and usually very expensive.
If you have an average company that needs to settle for something that works instead of something that is great, you could already replace a ton of jobs with AI. When the C-suite realises this, below average devs are doomed. In the coming years we will probably see the first companies taking off primarily employing AI as coders - long before AIs can beat elite programmers.
Conversations around AI are almost always unclear and poorly framed, driven by a profit-hungry hype train.
We're still talking about AI as though the goal was never actually to have artificial intelligence. LLMs are impressive for what they are, but they definitely aren't intelligent. OpenTextPrediction just doesn't have a nice ring to it and definitely wouldn't be valued at billions of dollars.
Whenever I see somebody type super intelligence or AGI, I remember that AI was supposed to be solved by a summer project at Dartmouth in 1956 and I chuckle a bit.
Well that's a great question, and an even more basic complaint I have with the AI research space. They have yet to bother coming up with a clear way to define or recognize intelligence or consciousness.
Those are interesting starting points. I don't know if I'd say it's that simple, but the direction seems totally reasonable.
The ability to solve problems is a particularly interesting one. To me there's a difference between brute forcing or pattern recognition and truly solving a problem (I don't have a great definition for that!). If that's the case, how do we really recognize which one an LLM or potential AI is doing?
It'd be a huge help if AI researchers put more focus on the interpretability problem before developing systems from which intelligence could plausibly emerge.
I'm a terrible developer; most people here wouldn't even consider me one. I have thought about this a lot.
Let's follow this all the way to the bottom: all coding employees are replaced by one very senior one who is AI-assisted.
One thought: how are juniors turned into seniors? Let's say we solve that with some yet-to-be-invented educational solution, and then companies that aren't code-heavy would hire them for much less money, or something like that.
The senior developer always keeps his job, because we can't have non-technical people deploying LLM code yet. Then maybe that becomes solved, so your non-technical CTO can deploy code.
This then creates an environment where fuck-ups are on the CTO for being non-technical. The blame aspect is the reason for this; it's political.
Or it creates an environment where infrastructure and software become a solved problem altogether.
Then we start considering whether AI can replace engineering altogether in many fields. All of commercial writing and the commercial creative arts are mostly taken over. Occasionally a brilliant human example moves things in a different direction, but this is quickly fed to the incumbent AIs and then it becomes commercialised.
What happens now? Everyone moves into hardware work?
I think before people would actually be "replaced", the productivity boost might actually cause more work: development costs suddenly go down, which means that new things that were previously too expensive (and there are tons of them) suddenly become low-hanging fruit. It's hard to predict what happens after that, though.
> One thought: how are juniors turned into seniors? Let's say we solve that with some yet-to-be-invented educational solution, and then companies that aren't code-heavy would hire them for much less money, or something like that.
In theory there are already many occupations, like medicine, where you have to study for years before you can do actual work, but coding will still be easier, since people who do it as a hobby will become good enough on their own.
If you define elite programmers, in the context of actual coding, as those who excel at implementing ideas and solutions, I could imagine that this skill might become less relevant with the advent of AI. Smashing out over 1000 lines of Haskell would then be the equivalent of being able to do complex calculations in your head.
However, if you define elite programmers as those who possess good domain knowledge, communication, management, and soft skills, then yes, they might become so productive that they could replace developers whose main skill is writing code as we move up a level of abstraction. While it might help today to have a certain level of understanding of Assembly and C, we do not need to be elite at them to be good software engineers.
I'm asking because I've met a few devs who are electrical engineers with a very good understanding of how a computer actually works, but who now earn more with React and Python.
The overlap between the two groups is quite large in my personal experience. The mythical code wizard with no domain knowledge or soft skills exists, but they are very rare. Most people who are really good at coding are also good at picking up domain knowledge.
Yeah, I would say that elite programmers are the ones who are able to deliver the most value with the tools they have, so they would likely be the ones who get the most out of AI as well, since they know how to make it do what they want and can tell whether the output is any good.