You might want to ask ChatGPT what that is referencing. Specifically, Steve Jobs telling everyone that Apple put the antenna right where people hold their phones, and that their bad reception was therefore their own fault.
The issue is really that LLMs are impossible to control deterministically, and no one has any real advice on how to reliably get what you want from them.
I recognized the reference. I just don’t think it applies here.
The iPhone antenna issue was a design flaw. It’s not reasonable to tell people to hold a phone in a certain way. Most phones are built without a similar flaw.
LLMs are of course nondeterministic. That doesn't mean they can't be useful tools. But unlike the iPhone problem, there isn't a clear solution here.
"Antennagate" is the gift that keeps on giving. Great litmus test for pundits and online peanut galleries.
You are technically correct: it was a design flaw.
But folks usually incorrectly trot it out as an example of a manufacturer who arrogantly blamed users for a major product flaw.
The reality is that essentially nobody experienced this issue in real life. The problem disappeared as long as you used a case, which is how the overwhelming majority of people use their phones. To experience the issue in real life you had to use the phone "naked", hold it a certain way, and have spotty reception to begin with.
So when people incorrectly trot this one out, I can safely ignore the rest of what they're saying...