
> floating point is definitely much faster than integer arithmetic these days, for most comparable operations.

Is that true of code running on CPUs? My GPU can run rings around my CPU at floating point, and its floating-point units can probably run rings around its own integer units too, but I find it hard to believe any CPU is faster at a floating-point version of an algorithm than at an integer version of the same (assuming both are equally well, or equally badly, optimised).
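
If anyone wants to poke at this on their own machine, here's a rough micro-benchmark sketch; the loop count, the multiply-by-a-constant loops, and the use of clock() are just my own choices for illustration, not something from the parent comment, and a serious measurement would need per-op counters and care about throughput vs. latency:

  /* Rough sketch: time a loop of float multiplies vs. a loop of
   * integer multiplies on the CPU. Compile with optimisation,
   * e.g. gcc -O2 fp_vs_int.c -o fp_vs_int
   * volatile keeps the compiler from deleting the loops, at the
   * cost of adding a load/store per iteration to both loops. */
  #include <stdio.h>
  #include <time.h>

  #define N 100000000UL

  int main(void) {
      volatile double        facc = 1.000000001;
      volatile unsigned long iacc = 3;  /* unsigned so wraparound is defined */

      clock_t t0 = clock();
      for (unsigned long i = 0; i < N; i++)
          facc = facc * 1.000000001;    /* dependent chain of FP multiplies */
      clock_t t1 = clock();
      for (unsigned long i = 0; i < N; i++)
          iacc = iacc * 3;              /* dependent chain of integer multiplies */
      clock_t t2 = clock();

      printf("float mul: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
      printf("int   mul: %.3f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
      printf("(results: %f, %lu)\n", facc, iacc);  /* print so the work can't be discarded */
      return 0;
  }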


