This article looks like a case of skating to where the puck is. Over the next 2-4 years this will change - the rate of improvement in AI is staggering and these tools are in their infancy.
I would not be confident betting a career on any of those patterns holding. It is like people hand-optimising their assembly back in the day: at some point the compilers get good enough that the skill is a curio rather than an economic edge.
There was a step change over the last few years but that rate of improvement is not continuing. The currently known techniques seem to have hit a plateau. It's impossible to know when the next "Attention is All You Need" will materialize. It could be in 2025 or 2035.
o1-style techniques (tree search, test-time training, that sort of thing) have not hit any recognizable plateau. There's still low-hanging fruit all around.
We said the same thing 3 years ago, and we still see errors on basic questions. I don't know where people get their estimates from. Their intuition?
I think the tech chauvinism (aka accelerationism) comes from the crypto-hype era and has unfortunately been merged into the culture wars, making reasonable discussion impossible in many cases.
Yes, that was exactly my point. Will AI get there? Sure. But how do they throw out a specific time prediction? 2-3 years is specific. It's so specific that companies could make strategic decisions to adopt it faster, and there is a huge price to pay if it turns out not to be as trustworthy and bug-free as hoped. That could be a real problem for the economy, with companies needlessly dealing with issues that cost money and time to fix. If people said "it's amazing now, and within the next decade it will be production-ready and usable with trust," that would cast a different outlook, and different strategies would be taken. But because the estimates are so specific and so close, everything changes, even if they say it again every 3 years for the next 10. So yeah, eventually we'll get there one day.