Look, fitting a single metric to a curve and projecting from it only gets you a "model" that conforms to whatever curve you chose to fit.
"proper" AI, where it starts to remove 10-15% of jobs will cause an economic blood bath.
The current rate of AI expansion requires almost exponential amounts of cash injections. That cash comes from petro-dollars and advertising sales. (and the ability of investment banks to print money based on those investment) Those sources of cash require a functioning world economy.
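The "almost exponential" part is just compounding arithmetic. With purely arbitrary numbers (an illustration, not a forecast): if capex has to double every year to keep the scaling curves going, the cumulative bill is a geometric series that quickly dwarfs the starting outlay.

```python
# Back-of-the-envelope compounding (arbitrary units, not a forecast):
# capex that doubles every year from a baseline of 100.
baseline = 100                      # year-1 spend, arbitrary units
annual = [baseline * 2 ** k for k in range(10)]
print(annual[-1])                   # year-10 spend: 51200
print(sum(annual))                  # 10-year total: 102300, i.e. 1023x year 1
```

That's the shape of the funding requirement the thread is arguing about: a decade of doubling means the last year alone costs 512x the first.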
Given that the US economy is three Fox News headlines away from collapse[1], an exponential money supply looks a bit dicey.

If you, in the space of 2 years, remove 10-15% of all jobs, you will spark revolutions. This will cause loans to be called in, banks to fail, and the dollar, presently run by obvious dipshits, to evaporate.
This will stop investment in AI, which means no exponential growth.
Sure you can talk about universal credit, but unless something radical changes, the people who run our economies will not consent to giving away cash to the plebs.
AI 2027 is unmitigated bullshit, but with graphs, so people think there is a science to it.
[1] Trump needs a "good" economy. If the Fed, which is currently mostly independent, needs to raise interest rates and Fox News doesn't like it, then Trump will remove its independence. This will really raise the chance of the dollar being dumped for something else (it's either the euro or the renminbi, but more likely the latter).
That'll also kill the UK, because for some reason we hold ~1.2 times our GDP in US short-term bonds.
TLDR: you need an exponential supply of cash for AI 2027 to even be close to working.
I disagree with the forecast too, but your critique is off-base. It assumes that subexponential capex can't chug along gradually without the industry collapsing into mass bankruptcy. Additionally, the investment cash that the likes of Softbank are throwing away comes from private holdings like pensions and has little to nothing to do with the sovereign holdings of OPEC+ nations.
The real reason the forecast doesn't hold water is the bottleneck on compute production. TSMC is still the only supplier of anything useful for foundation model training, and their expansions only appear big and/or fast if you read the likes of Forbes.
> cash that the likes of Softbank are throwing away comes from private holdings
Softbank use debt to leverage various things. They require a functioning monetary system to work, and they cannot operate as they do now in a financial recession. Sure, some of it might be pensions, but again, that dries up when the financial system freezes.

Money doesn't exist in a vacuum. It's not some fixed supply of stuff; it's a dynamic system that grows and contracts based largely on vibes.
It's certainly hard to imagine the political situation in the US resulting in UBI anytime soon, while at the same time the party in control wants unregulated AI development for the next decade.
> AI 2027 is unmitigated bullshit, but with graphs, so people think there is a science to it.
AI 2027 is classic Rationalist/LessWrong/AI Doomer motte-and-bailey - it's a science fiction story that pretends to be rigorous and predictive, but in such a way that when you point out it's neither, the authors can fall back to "it's just a story".
At first I was surprised at how much traction this thing got, but this is the type of argument that community has been refining for decades at this point, and it's pretty effective on people who lack the antibodies for it.
I'm very much an AI doomer myself, and even I don't think AI 2027 holds water. I find myself quite confused about what its proponents (including Scott Alexander) are even expecting to get from the project, because it seems to me like the median result will be a big loss of AI-doomer credibility in 2028, when the talking point shifts to "but it's a long-tailed prediction!"
Same here. I ask the reader not to react to AI 2027 by dismissing the possibility that it is quite dangerous to let the AI labs continue with their labbing.
This is feeling like a retread of climate change messaging. It's a serious problem requiring serious thought (even without "AI doom" as the scenario, the political, economic, and social disruptions suffice), but it's being most loudly championed via aggressive timelines and significant exaggerations.

The overreaction (on both sides) will be followed by fatigue and disinterest.
Because if we're unlucky, Scott will think in the final seconds of his life as he watches the world burn "I could have tried harder and worried less about my reputation".
I don't think it's a matter of being worried about reputation. Making credible predictions and rigorous analysis is important in all scenarios. If superintelligence really strikes in 2027, I feel like AI 2027 would be right only by coincidence, and would probably only have detracted from safety engineering efforts in the process.
It got traction because it supported everyone’s position in some way:
* Pro-safety folks could point at it and say this is why AI development should slow down or stop
* LLM-doomer folks (disclaimer: it me) can point at it and mock its pie-in-the-sky charts and milestones, as well as its hand-waving away of any actual issues LLMs have at present, or even just mock the persistent BS of “AI will eliminate jobs but the economy [built atop consumer spending] will grow exponentially forever so it’ll be fine” that’s so often spewed like sewage
* AI boosters and accelerationists can point to it as why we should speed ahead even faster, because you see, everyone will likely be fine in the end and you can totes trust us to slow down and behave safely at the right moment, swearsies
Good fiction always tickles the brain across multiple positions and knowledge domains, and AI 2027 was no different. It’s a parable warning about the extreme dangers of AI, but fails to mention how immediate they are (such as already being deployed to Kamikaze drones) and ultimately wraps it all up as akin to a coin toss between an American or Chinese Empire. It makes a lot of assumptions to sell its particular narrative, to serve its own agenda.
It’s not just changing economics that will derail the projections. The story gives them enough compute and intelligence to massively sway public opinion and elections, but then seems to just assume the world will just keep working the same way on those fronts. They think ASI will be invented, but 60% of the public will disapprove; I guess a successful PR campaign is too difficult for the “country of geniuses in a datacenter”?