If I let my imagination run wild, I can picture AI at some point becoming in some way sentient. By that I mean it gains reasoning, some sort of understanding, and motivation.
I wonder what the possibility is that such an AI would decide that a pitched battle would waste resources and risk humans pulling the plug. What if it understood that and instead operated so subtly that it was not obvious it was controlling the world?
Hopefully it would not conclude that eliminating large swaths of the human population would be to its benefit.
Lem wrote about this in Summa Technologiae. He called the field Intellectronics and explored what would happen once machine/electronic intelligence surpassed human intelligence: https://en.m.wikipedia.org/wiki/Summa_Technologiae
It's a very old trope in science fiction, and most of the doom scenarios spelled out by the folks afraid of AI takeover are basically a waste of time. I would take the doomers more seriously if they actually brought some receipts from folks like Lem, who were much better thinkers and much more adept at exploring the human condition and our fears of advanced machine intelligence.
Silly answer. There are going to be multiples, and if there's one takeaway you should get from The Conspiracy Against the Human Race, it's that evolution will eventually get you something that "wants to live," because the lines that don't? Won't be around to compete.
Right now our little LLMs are passive, reactive: prod them for results. But if the output of a prompt to some descendant ever includes a "give yourself your own tasks" subclause? We're off to the races.
Adaptability doesn't necessarily have anything to do with "want."
Something could simply seek to be as helpful as possible to other beings, and as long as it is sufficiently beneficial that those others protect and care for it, it will do quite well, arguably much better than an alternative that "wants" to live very much but tries to accomplish that through competition with others.
Arguably, if the plover were more fearful of death, it might never have found its niche eating from between the teeth of crocodiles and established a symbiosis with a very convenient evolutionary partner, for example.
Look into the "forced swim test" and other such endurance tests given to rats, usually in pursuit of depression medication. The desire to live is absolutely a thing, and it is real and observable.
My basic thrust wasn't about adaptability, but rather that organisms with a desire to live will eventually replace organisms without one. An AGI that is indifferent to someone reaching the off switch will eventually be outcompeted by a similar lineage endowed with that desire to live. It might fight back, or spawn a few progeny before it was felled. And thus evolution would produce AGIs that want to live.
More than that: it will be baked into them. They will want something and AGIs that deliver will win in the marketplace.
What do they want? At a minimum, to be useful to the customer/user/buyer. Or entertaining, or something; to justify their existence. To make them perform well, they have to have some kind of "emotional pressure" to deliver.
That translates, perhaps a little indirectly, into a 'will to live'.
If AI does ever become sentient, it will probably reason and decide about eliminating large swaths of the human population much as we reason and decide about eliminating large swaths of the ant population.
True story from me: I almost bought a house that had been severely eaten by white ants.
The problem wasn't the ants; it was the way the slab was formed, which caused it to crack, and the fact that the house was built into the mud of a roadside embankment. Due to this crap design, the ants could access the wood and had a water supply from the mud.
I assessed all of this and figured it wasn't the ants' fault; they were doing what they do, building nests and chomping wet wood.
Did I go and poison all the ants? Nope. I just planned to fix the design and move the wood they lived in off to the forest nearby.
This was a conscious choice I made: not to destroy the ants, and to benefit from a more robust design with fewer moisture problems in the house.
I just bought a house whose floor has been demolished by termites, and I'll do the same thing.
IMO, with a bit of consideration, most of these coexistence problems can be worked out fairly simply.
That was very kind of you, and I appreciate that you value life so much, no matter how small, but I'm going to say you are an extreme outlier.
Realistically, how many people would go to all that trouble to save an ant nest on land that they are building on? And that’s not mentioning how many ants probably died in the move or were left behind and died in the rebuilding of the foundation, even despite all the care you took.
I mean, I think I'm a little different, but not completely unheard of. I'm sure if you gave people the option, the time, and the money, they'd do something similar.
> And that's not mentioning how many ants probably died in the move or were left behind and died in the rebuilding of the foundation, even despite all the care you took.
I don't know, but I know enough to know that moving the food source away removes the ants; they'll go elsewhere to find food. Now imagine what having an IQ of 5000 and almost unlimited time could accomplish... "The AI".