
Pixar is paying a massive premium; they are probably using an order of magnitude or two more CPUs than they would need if they could use GPUs. Using a hundred CPUs in place of a single H100 amounts to a premium even bigger than the H100's own notorious markup.
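
For a rough sense of scale, a back-of-envelope sketch is below. Every figure in it is an assumed round number for illustration (an H100 at roughly $30k, a CPU render node at roughly $10k), not a quoted price, and each "CPU" is treated as a whole server node:

    # Rough cost comparison for the premium described above.
    # All prices are assumptions for illustration, not quoted figures.
    h100_price = 30_000          # assumed cost of one H100 (USD)
    cpu_node_price = 10_000      # assumed cost of one CPU render node (USD)
    cpu_nodes_needed = 100       # "a hundred CPUs in place of a single H100"

    cpu_total = cpu_nodes_needed * cpu_node_price
    premium = cpu_total - h100_price

    print(f"CPU fleet cost: ${cpu_total:,}")
    print(f"H100 cost:      ${h100_price:,}")
    print(f"Premium:        ${premium:,} (~{cpu_total / h100_price:.0f}x the GPU price)")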


Would Pixar's existing software run on GPUs without much work?


It does already, at least on Nvidia GPUs: https://rmanwiki.pixar.com/pages/viewpage.action?mobileBypas...

They currently use the GPU mode only for quick iteration on relatively small slices of data, though, and switch back to CPU mode for the big renders.



