
Most arguments and discussions around AGI talk past each other over what is wanted or expected, largely because sentience, intelligence, and consciousness have no agreed-upon definitions and therefore make for undefined goals to build against.

Some people expect AGI to be a faster horse: the next evolution of human intelligence, similar to us in most respects but still "better" in some. Others expect AGI to be the leap from horses to cars: a means to an end, a vehicle that takes us to new places faster, and in that case it doesn't need to resemble how we arrived at human intelligence at all.


