I read more sceptical takes about AI on Hacker News than anywhere else (since I stopped following Gary Marcus, at least). My hunch is that some people here might feel professionally threatened by it, so they want to diminish it. This is less of an issue with some of the 'normies' I know. For them AI is not professionally threatening; they use it to translate things, ideate about cupcake recipes, use it as a psychologist (please don't shoot the messenger), or help with lesson planning for teaching kids.
> My hunch is that some people here might feel professionally threatened by it, so they want to diminish it.
I don't think it's this. At least, I don't see a lot of that. What I do see a lot of is people realizing that AI is massively overhyped, and a lot of companies are capitalizing on that.
Until/unless it moves on from the hype cycle, it's hard to take it that seriously.
Speaking as a software engineer, I'm not at all threatened by it. I like Copilot as fancy autocomplete when I'm bashing out code, but that's the easy part of my job. The hard part is understanding problems and deciding what to build, and LLMs can't do that and will never be able to do that.
What I am annoyed by is having to tell users and management "no, LLMs can't do that" over and over and over and over and over. There's so much overhype and flat-out lying about capabilities, and people buy into it and want to give decision-making power to a statistics model that's only right by accident. Which: No.
It's a fun toy to play with and it has some limited uses, but fundamentally it's basically another blockchain: a solution in search of a problem. The set of real world problems where you want a lot of human-like writing but don't need it to be accurate is basically just "autocomplete" and "spam".
I disagree with the characterisation of AI as "another blockchain: a solution in search of a problem". The two industries have opposite problems: crypto people are struggling to create demand, AI people are struggling to keep up with demand.
> crypto people are struggling to create demand, AI people are struggling to keep up with demand.
The same was true of crypto 10 years ago: it was what everyone wanted. You can see how bitcoin soared, and crypto scams were everywhere and made many billions.
And no, AI is not struggling to keep up with user demand; it is struggling to keep up with free demand, not paid demand. So what you mean is that AI is struggling to keep up with investor demand: more people want to invest in AI than there is compute to buy. But that was also true of bitcoin; bitcoin mining massively raised GPU prices because of how much investors put into it.
But investor-driven demand can disappear really quickly; that is what people mean by an AI bubble.
Google has built a multibillion dollar business on top of "free" users. ChatGPT has more than 400 million weekly active users and this is obviously going to grow. You are overlooking how easily that "free" demand will be monetized as soon as they slap ads on the interface.
"Obviously" is doing a lot of heavy lifting there. They have a lot of competition and no killer use case and no IP rights. From the consumer point of view, they're fungible.
That's not even considering the probability that demand could slow as people lose interest.
Yep. Ten years ago HN was hyping blockchain as the future and there were a million blockchain startups. Just look at the list of startups funded by YC around then lol