OpenAI most likely spent six or even seven figures producing the GPT-2 model. The authors of this article say they spent $500k to replicate OpenAI's results: each training run cost $50k, and they needed many runs to find the optimal configuration. Note that the GPT-2 paper had already described many crucial details (e.g., the dataset); without those details, replication would have been a lot harder.
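A quick back-of-the-envelope check of the figures quoted above (the $500k total and $50k-per-run numbers come from the comment; the split is an assumption, not something the article breaks down):

```python
# Sanity check: how many full training runs does a $500k budget buy at $50k each?
total_budget = 500_000   # reported replication spend, USD
cost_per_run = 50_000    # reported cost of one full training run, USD

runs_afforded = total_budget // cost_per_run
print(runs_afforded)  # → 10
```

So the budget corresponds to roughly ten full runs, which is consistent with "many runs to find the optimal configuration" rather than a single lucky shot.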

