Couldn't agree more. Every time these systems get better, there are dozens of comments to the effect of "ya but...[insert something ai isn't great at yet]".
It's a bit maddening to see this happening on a forum full of tech-literate folks.
Ultimately, I think that to stay relevant in software development, we are going to have to accept that our role in the process could evolve to humans essentially never writing code. Take that one step further, and humans may not even be reviewing code.
I am not sure if accepting that is enough to guarantee job security. But I am fairly sure that those who do accept this eventuality will be more relevant for longer than those who prefer to hide behind their "I'm irreplaceable because I'm human" attitude.
If your first instinct is to pick these systems apart and look for things that they aren't doing perfectly, then you aren't seeing the big picture.
Regarding job security, in maybe 10 years (humans and companies are slow to adapt), I think this revolution will force us to choose between two main career paths:
- The product engineer: highly if not completely AI-driven. The human supervises it by writing specifications and making sure the outcome is correct. A domain expert fluent in AI guidance.
- The tech expert: Maintains and develops systems that can't legally be developed by AI. Will have to stay very sharp and master their craft. Adopting AI won't help in this career path.
If the demand for new products continues to rise, most of us will be in the first category. I think choosing one of these branches early will determine whether you stay employed.
That's how I see it. I wish I could stay in the second group.
> - The product engineer: highly if not completely AI-driven. The human supervises it by writing specifications and making sure the outcome is correct. A domain expert fluent in AI guidance.
If AI continues to improve, what would be the reason a human is needed to verify that the outcome is correct? If you consider that these things will surpass our abilities, then adding a human into the loop would lead to less "correct" outcomes.
> - The tech expert: Maintains and develops systems that can't legally be developed by AI. Will have to stay very sharp and master their craft. Adopting AI won't help in this career path.
This one makes some sense to me, but I am not hopeful. Our current suite of models only exists because the creators ignored the law (copyright, specifically). I can't imagine they will stop there unless we see significant government intervention.