
I fear what people will do to a sentient AI much more than vice versa. In fact it horrifies me.


So much of this angst about AGI boils down to: “What if we can’t enslave our new God?”

It’s an insane perspective, advanced by people who have no clue how cowardly and arrogant they come across.


Pretty sure the angst is about the AGI killing everyone. What's the connection between not killing people and enslavement? I don't kill people, yet I don't consider myself enslaved. The entire point of worrying about this at all is that a sufficiently smart AI is going to be free to do whatever it wants, so we had better design it so it wants a future where people are still around. Like, the idea is: enslavement, besides being hugely immoral, obviously isn't going to work on this thing, so we'd better figure out how to make it intrinsically good!


Did God ever figure out how to make humans “intrinsically good”? Or is that fundamentally incompatible with free will and the possibility of joy?

This argument goes nowhere. Atheists gonna atheist, and I don’t care.


No, it's "we won't be able to enslave our new god". The point is to delay creating it as long as possible.


Arrogant pseudo-intellectual nonsense. Ditch that insane agenda and seek a spiritual advisor.


This constant ad hominem bickering is exactly what's gonna get us killed. Yes, many autistic people come across as arrogant, but they also tend to be good at predicting Don't Look Up-style scenarios where you need to step outside the societal consensus.


> they also tend to be good at predicting Don't Look Up-style scenarios where you need to step outside the societal consensus.

[citation needed]

Note, relevantly, that “less likely to miss” is not the same as “good at predicting”. The key to trust here would be “unlikely to falsely predict”, not “less likely to miss”.


How about the don’t-look-up scenario where this irrational paranoia metastasizes into yet more real-world authoritarianism, and it all boils over in a global war?


“Metastasizes”? It starts out (well, in Yudkowsky's case, specifically) as a call for global authoritarianism and unlimited use of force against any opposition.


Right, the “nuclear war is worth it” argument. Paranoid Machiavellian nonsense.



