The lack of tech literacy in this article is a bit concerning:
>Some researchers take this so seriously they won’t work on planes, coffee shops or anyplace where someone could peer over their shoulder and catch a glimpse of their work.
I'm almost certain this was originally a reference to public wifi networks, since planes and coffee shops are the prototypical examples cited in that context. The writer turned it literally into a matter of someone looking over their shoulder, which loses so much in translation it reads almost like a joke written to illustrate someone missing the point.
>OpenAI and its brash chief executive, Sam Altman
This also strikes me as nonsense. It's the first time I've ever heard someone describe Sam Altman as brash. The only way I can see them getting there is: (1) tech executives are often brash, (2) Altman is a tech executive, (3) so let's just go ahead and call him brash.
Nevertheless, if this account of the GPT-5 and/or o3 training is accurate, it strikes me as significant news, though perhaps also a missed opportunity to say more about the dynamics that explain why the training isn't working, or to discuss in interestingly specific terms things like training strategies and synthetic data.