
On what time horizon does this happen? Because this sounds like a wishful utopia to me.

I would expect current progress in AI to deliver the equivalent of a veritable army of consultants for very cheap, available to basically everyone within a decade or so. But that is not gonna make foreign labor worthless (or labor in general-- maybe a lot of white-collar/creative work, we'll see). And trade is always gonna have value until the earth is perfectly homogeneous, simply because it lets you extract value both from being better at things than other nations (=> export) AND from being worse (=> import).

If you are gonna go full autarky, you are going to be left behind by countries that don't, because your spread-out efforts will struggle to compete with nations that actually focus on things, and in-housing everything will drive up costs and prices tremendously.






    On what time horizon does this happen?
My estimation is that it is one or two generations away.

Ray Kurzweil thinks about the timing more than I do, has a pretty good track record with his predictions, and estimates it will happen around 2045.

    trade is always gonna have value
We don't trade much with apes and birds, do we? And we don't let them invest in our stock markets. We also don't pay them dividends for the land we took from them.

As soon as one country achieves way higher intelligence than the rest of the world, things might change in a fundamental way.


I personally don't buy the whole singularity argument at all; I see no good examples of interesting intellectual tasks that scale well with the number of people thrown at them, and I see the whole AI thing developing exactly the same way-- exponentially increasing demands on resources for smaller and smaller gains in utility, without any runaway self-improvement at all.

> We don't trade much with apes and birds, do we? And we don't let them invest in our stock markets. We also don't pay them dividends for the land we took from them.

This sounds immensely misanthropic to me; if we hit a scenario like that, where a majority of US "entities" (?) share this kind of outlook on other humans, I strongly doubt that you (or I) are gonna be part of the "we" in that world, and I'd consider it more of a "may god have mercy" worst case for our species than anything to be helped along.


    smaller and smaller gains in utility
What type of work do you think will still require humans in 50 years?

    sounds immensely misanthropic
I don't think closing our eyes will prevent technological progress.


