I personally don't buy the whole singularity argument at all; I see no good examples of interesting intellectual tasks that scale well with the number of people thrown at them, and I see the whole AI thing developing exactly the same way-- exponentially increasing demands on resources for smaller and smaller gains in utility, without any run-away self improvement at all.
> We don't trade much with apes and birds, do we? And we don't let them invest in our stock markets. We also don't pay them dividends for the land we took from them.
This sounds immensely misanthropic to me; if we hit a scenario like that, where a majority of us "entities" (?) share this kind of outlook on other humans, I strongly doubt that you (or I) are gonna be part of the "we" in that world, and I'd consider this more of a "may god have mercy" worst case for our species than anything to be helped along.