Agreed that the PyTorch tutorials are a great place to start. Specific to FlexAttention, the blog references the accompanying attention-gym repo, which has a series of examples of how to use flex: https://github.com/pytorch-labs/attention-gym/
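
For anyone who wants a feel for the pattern before diving into the gym, here's a minimal sketch of what those examples look like (assumes a recent PyTorch, roughly 2.5+, where flex_attention lives under torch.nn.attention.flex_attention; the relative_bias name is just illustrative): you write a score_mod callable that rewrites each attention score given its batch, head, and query/key indices, and pass it to flex_attention.

    # minimal sketch, not the library's canonical example
    import torch
    from torch.nn.attention.flex_attention import flex_attention

    def relative_bias(score, b, h, q_idx, kv_idx):
        # score_mod: adjust each attention score given batch index,
        # head index, and query/key positions -- here a simple
        # relative-position bias (hypothetical example)
        return score + (q_idx - kv_idx)

    B, H, S, D = 2, 4, 128, 64
    q, k, v = (torch.randn(B, H, S, D) for _ in range(3))

    out = flex_attention(q, k, v, score_mod=relative_bias)
    # for real workloads you'd typically compile it:
    # flex_attention = torch.compile(flex_attention)

The attention-gym examples follow the same shape for things like ALiBi, sliding-window, and document masking, which is why it's such a useful companion to the blog post.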

