Hacker News

Try to read the code and understand how it works and you will find it very challenging to interpret. And not just the code: the documentation is also sparse and hard to read. Compare that to the OpenAI code, which is so concise and easy to read. There is mastery in doing that, deep mastery. Few repositories in the TensorFlow or PyTorch organizations get to that level.



Agreed re: OpenAI's GPT implementation. It took roughly a year to appreciate how simple it is. https://github.com/openai/gpt-2/blob/0574c5708b094bfa0b0f6df...

Especially compared to StyleGAN, BERT, or pretty much anything else.

I used to hate the OpenAI GPT codebase: zero comments? no classes? What does "mlp" even mean? But over time, I find myself reverting to their style.
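For context, "mlp" in that codebase is the transformer's position-wise feed-forward block. A minimal NumPy sketch of the style the parent describes (plain functions, terse names, no classes; the weights here are illustrative placeholders, not the actual GPT-2 parameters):

```python
import numpy as np

def gelu(x):
    # tanh approximation of GELU, the activation GPT-2 uses
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def mlp(x, w_fc, b_fc, w_proj, b_proj):
    # feed-forward block: expand to a wider hidden size, apply the
    # nonlinearity, then project back down to the model dimension
    h = gelu(x @ w_fc + b_fc)        # [n, d] -> [n, 4d]
    return h @ w_proj + b_proj       # [n, 4d] -> [n, d]
```

The whole style is like that: a handful of short functions whose meaning you reconstruct from shapes and names rather than from comments or class hierarchies.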


Honestly, the library doesn't seem that hard to understand, although it can be under-documented at times - I found looking through the source very helpful.



