AGI will be the worst thing that has ever happened to humanity. Even if you can't see that, you can see that it has the potential to be very bad. I know this because OpenAI literature always stresses that AGI has to be guided and developed by the right people because the alternative is rather unpleasant. So essentially it's a gamble, and you know it. With everyone's lives at stake. But instead of asking everyone whether or not they want to take that gamble, you go ahead and roll the dice anyway. Instead of trying to stifle the progress of AI, you guys add fuel to the fire. Please work on something else.
This argument is akin to the argument that the LHC might create a black hole that will destroy the world.
"Scientists might do something that will destroy us all and I know this because I read a few sci-fi novels and read a few popsci articles but am otherwise ignorant about what the scientists are actually doing. But since the stakes are so high (which I can't show), on the chance that I'm right (which is likely 0) we should abandon everything."
And do the people at CERN often publish literature that warns of the possibility of black holes, and advocate that particle acceleration be done by people with good intentions so that the black holes are kept at bay? Your comment is so full of holes that I can see through it.
It may be the worst thing that has ever happened to humanity. It may also be the best. I lean optimistic myself. The whole "temporary survival machine for your genes" existence we've had so far is overrated, in my opinion.
I'm curious as to why you think AGI will have an inherently bad effect on society. Personally, I have a hard time believing any technology is inherently good or bad.
On the other hand, if a society's institutions are weak, its leaders evil or incompetent, or its people uneducated, it's not hard to imagine things going very, very wrong.
The idea that technology is always a net zero, cutting equally in both good and bad directions, is fuzzy thinking. It is intuitively satisfying but it is not true.
Humans are a technology. When there is other technology that does intelligent signal processing better than us, we will no longer proliferate. It's remarkable that we can watch the arrival and departure of all kinds of technologies, time and time again, and yet think we are immutable.
The reason human history is filled with humans is that every time a country was defeated by another country or entity, the victorious entity was a group of humans. When machines are able to perform all the signal processing that we can, when they are smarter than us, this will no longer be true. The victorious entity will be less and less human each time. Eventually it will not be human at all. This is true not just in war but everywhere, including the global market. It's a simple, plain fact that cannot be disregarded.