
I'd like to see a source for that claim. Regardless, pre-GCN GPUs had double-precision support that wasn't purposefully knee-capped, and 52 bits of mantissa would have been plenty.

As for NVIDIA knee-capping integer arithmetic on their GPUs... I seriously doubt that's the case. Pointer arithmetic (and hence memory access) requires integer operations, and I've seen very little evidence of any artificial issues with it.
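To illustrate the "52 bits of mantissa would have been plenty" point: the GameCube/Wii GPU works with narrow integers, and the product of two 24-bit values fits in 48 bits, well under the 2^53 limit a double can represent exactly. The following is a minimal CUDA sketch (the kernel name and test values are hypothetical, chosen only to show the worst case), not anything from the thread:

    // nvcc exact_product.cu
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void exact_product_check(unsigned int a, unsigned int b, int *ok) {
        // Emulate an integer multiply in double precision.
        double prod_d = (double)a * (double)b;                    // 48-bit result < 2^53, so exact
        unsigned long long prod_i = (unsigned long long)a * b;    // reference integer result
        *ok = ((unsigned long long)prod_d == prod_i);
    }

    int main() {
        int *ok;
        cudaMallocManaged(&ok, sizeof(int));
        // Worst case: two maximal 24-bit values.
        exact_product_check<<<1, 1>>>(0xFFFFFF, 0xFFFFFF, ok);
        cudaDeviceSynchronize();
        printf("double-precision product exact: %s\n", *ok ? "yes" : "no");
        cudaFree(ok);
        return 0;
    }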



There's no public source for the claim; it's what an AMD engineer told me via private e-mail (and I don't want to publish private mail for obvious reasons).

That said, there's a "fast" path on AMD GPUs for shader code that only requires 24 bits of integer precision. That's exactly enough for GameCube/Wii GPU emulation; however, I'm not sure whether their shader compiler properly optimizes our code to use that path.
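For a concrete picture of what a 24-bit integer fast path looks like, here is a minimal CUDA sketch using NVIDIA's __umul24 intrinsic (which multiplies only the low 24 bits of its operands and was the fast path on older GPUs). This is just an illustration of the concept; AMD's shader path discussed above would be selected by its own compiler, and the kernel name and indexing scheme here are hypothetical:

    __global__ void index_texels(const unsigned int *in, unsigned int *out,
                                 unsigned int width, unsigned int n) {
        unsigned int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        unsigned int x = i % width;
        unsigned int y = i / width;
        // __umul24 uses only the low 24 bits of each operand, which is enough here:
        // GameCube/Wii texture coordinates and strides fit comfortably in 24 bits.
        unsigned int offset = __umul24(y, width) + x;
        out[i] = in[offset];
    }

Whether an emulator actually gets this path depends on the driver's shader compiler proving that operands stay within 24 bits, which is exactly the uncertainty raised above.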



