
Unless the quality of the human data is extraordinary, it seems, according to TFA, that it's not that easy:

> The process is painfully slow. GPT-4 was trained on an estimated 13 trillion tokens. A thousand people writing 5,000 words a day would take months to produce a billion tokens.
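
A quick back-of-the-envelope check of that claim (assuming roughly 1.3 tokens per English word, which is my own rough heuristic, not a figure from the article):

    # Rough estimate: how long would 1,000 writers need to produce a billion tokens?
    writers = 1_000
    words_per_day = 5_000
    tokens_per_word = 1.3  # assumed ratio, not from the article
    tokens_per_day = writers * words_per_day * tokens_per_word  # ~6.5M tokens/day
    days = 1e9 / tokens_per_day
    print(f"{days:.0f} days")  # ~154 days, i.e. about five months

So "months" for a billion tokens checks out, and 13 trillion tokens is four orders of magnitude beyond even that.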

And if the human-generated data were of such high quality that three orders of magnitude less of it would suffice, then I can assume it would be at least as expensive to produce as o3.


