
Performance in games typically doesn't correspond closely with how good a GPU will be at compute. Rendering games, video, etc. is largely constrained by things like memory bandwidth, culling, image decoding, and blending - all stuff typically done with special-purpose hardware that won't be very useful for compute or neural nets. Sometimes games or media have complex shaders that are entirely limited by compute, but that's not terribly common.
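
As a rough illustration of the bandwidth-bound vs. compute-bound distinction, here's a back-of-the-envelope roofline check. The peak numbers and the FLOPs-per-byte figures are placeholders, not specs of any real GPU or workload:

  # Rough roofline check: is a kernel bandwidth-bound or compute-bound?
  # All figures are illustrative placeholders, not real hardware specs.
  peak_flops = 10e12   # hypothetical 10 TFLOP/s of FP32 compute
  peak_bw    = 400e9   # hypothetical 400 GB/s of memory bandwidth

  def bound(flops_per_byte):
      # Machine balance: FLOPs the GPU can do per byte it can move.
      balance = peak_flops / peak_bw
      return "compute-bound" if flops_per_byte > balance else "bandwidth-bound"

  # Texture blending touches a lot of memory per arithmetic op.
  print("blending:", bound(0.5))        # -> bandwidth-bound
  # A large matrix multiply reuses each byte many times.
  print("large matmul:", bound(100.0))  # -> compute-bound

The point is that a GPU can win gaming benchmarks on the strength of its memory system and fixed-function hardware while telling you little about how fast it chews through the compute-dense kernels neural nets care about.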

Apple's dedicated ML hardware is probably quite good, but we don't have any way to know how good it is without doing the math on die size and power draw, and running benchmarks.
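
For the benchmarking half of that, here's a minimal sketch of measuring sustained matmul throughput, assuming a recent PyTorch build with the MPS backend (matrix size and iteration count are arbitrary). Note this exercises the GPU cores, not the Neural Engine:

  import time
  import torch

  # Use Apple's GPU backend if present; fall back to CPU otherwise.
  device = "mps" if torch.backends.mps.is_available() else "cpu"

  n = 4096
  a = torch.randn(n, n, device=device)
  b = torch.randn(n, n, device=device)

  # Warm up, then time a batch of multiplies.
  for _ in range(3):
      c = a @ b
  if device == "mps":
      torch.mps.synchronize()

  iters = 20
  start = time.time()
  for _ in range(iters):
      c = a @ b
  if device == "mps":
      torch.mps.synchronize()
  elapsed = time.time() - start

  # A matmul of two n x n matrices costs roughly 2*n^3 FLOPs.
  tflops = 2 * n**3 * iters / elapsed / 1e12
  print(f"{device}: ~{tflops:.1f} TFLOP/s sustained")

Comparing the measured number against the theoretical peak you'd estimate from die size, clocks, and power tells you how much of the silicon is actually usable for this kind of work.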


