
> For an AI to understand that it needs to preserve its existence in order to carry out some goal implies an intelligence far beyond what any AI today has.

Not necessarily. Our own survival instinct doesn't work that way - it's not a high-level rational thought process, it's a low-level behavior (hence "instinct").

An AI can acquire such an instinct much the way we got ours: through iterative development. Any multi-step task we want the AI to perform implicitly requires that it not break down between steps. That kind of survival bias will be implicit in just about any training or selection process we use, reinforced at every step, more strongly than almost any other pattern - so it makes sense to expect the resulting AI to have a generic, low-level, pervasive preference for continuing to function.
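To make that concrete, here's a toy sketch (my illustration, not how any real system is trained): a selection loop in Python where the only criterion is completing a multi-step task - nothing rewards self-preservation directly. The break_prob parameter and all the constants are assumptions for the demo.

  import random

  STEPS = 10   # length of the multi-step task
  POP = 200    # population size
  GENS = 50    # generations of selection

  # Each "agent" is just one number: its per-step probability of
  # breaking down mid-task. Survival itself is never rewarded.
  population = [random.uniform(0.0, 0.5) for _ in range(POP)]

  def completes_task(break_prob):
      """An agent completes the task only if it survives every step."""
      return all(random.random() > break_prob for _ in range(STEPS))

  for gen in range(GENS):
      # Keep only agents that finished the task; task completion is
      # the sole criterion - persistence is selected as a side effect.
      survivors = [a for a in population if completes_task(a)] or population
      # Refill the population from survivors, with small mutations.
      population = [
          max(0.0, random.choice(survivors) + random.gauss(0, 0.01))
          for _ in range(POP)
      ]

  avg = sum(population) / POP
  print(f"mean per-step break probability after {GENS} generations: {avg:.4f}")

Run it and the mean break probability falls toward zero, even though the objective only ever mentions finishing the task - which is the point: persistence gets reinforced as a side effect of any procedure where only runs that finish count.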


