Hacker News

Our current obsession with superintelligence reminds me of the Great Oxidation Event a few billion years ago. Super-photosynthesis was finally achieved, and then there was a great extinction.

If you believe that superintelligence is unavoidable and a serious risk to humanity, then the sensible thing to do is to prepare to leave the planet, à la Battlestar Galactica. That will be easier than getting the powers that be to agree and cooperate on sensible restrictions.




If the human cooperation problem is unsolvable, I doubt creating a new human society with the same capabilities elsewhere would do much at all.


Humans and their ancestors have reproduced on Earth for millions of years. I think the human cooperation problem is overstated. We cooperate just fine, arguably too well, to the detriment of other species.




