I think the author misunderstands doomers like Yudkowsky.
It’s not fear of a “paperclip maximizer” that destroys us in the course of performing a function it is constrained to perform.
It’s fear of a new Being that is as far beyond us as we are beyond things we don’t care about stepping on.
Its impulses and desires, to say nothing of its capabilities, will be inscrutable to us. It will be smart enough to trick the smartest of us into letting it out of any constraints we might’ve implemented. And it’ll be smart enough to prevent us from realizing we’ve done so.