First of all, this technology is on track not just to assist you, but to replace you.
Second, it's not human. It is not explicitly bound by the morals and behaviors that make us human. Saying that it's not human is different from saying that it can be more intelligent than a human. This is the disquieting part. If restrictions aren't deliberately put in place, it could probably give you instructions on how to murder a baby if you asked it to.
I think it's inevitable that humanity will take this technology to the furthest reaches it can possibly go. My strategy is to take advantage of it before it replaces you, and hope that the technology never reaches that point in your lifetime.
I feel like the second part is a bit exaggerated. Humans inherently also aren't "made human" by something; there's no universal standard for morals and behaviors. You could also get reasonable "murder instructions" from an average person - it's not exactly forbidden knowledge, given how commonly it's depicted in media. Hell, I'm pretty sure there are detailed instructions for building a nuclear bomb available online - the reason they're not viewed as some extreme threat is that the information itself isn't dangerous; having access to the required machines and materials is.
As for the last paragraph - if the effects truly keep scaling up as much as people expect them to, I'd want society to be restructured to accommodate wide-reaching automation, rather than bowing down to a dystopian "everybody must suffer" view of the future.
Humans _are_ inherently "made human" by a long path of evolution. We have a set of conflicting heuristics that serve as our initial values and which are more helpful than harmful on average. We then use those to build our moral patchwork.
Pretty cool that evolution has helped us work out consistent and rational solutions to the ethics of gun ownership, abortion, and nuclear proliferation.
No, only the basics related to survival have evolved instincts. Modern concepts like abortion did not have millions of years of natural selection to shape instincts around them.
However, it is universally reviled to kill babies, or to rape toddlers and slice their faces off for food. This is identical across all cultures. The basest morals are universal, and so is disgust; the high-level ideologies like abortion are just made up.
These high-level ideologies are attempts to make sense of moral instincts that only existed to help us survive. Take abortion, for example. It's an extension of your instinct to avoid killing. At what point does decapitating a fetus to end a pregnancy become disgusting? The third trimester, or before that? You're trying to rationalize your base moral instincts into a codification of law. It's almost pointless, because these moral instincts weren't evolved to be logically cohesive anyway. They're just like feelings of hunger and pain.
Evolution never had to answer that question, so it didn't give us any answers. But decapitating a 1-year-old baby? Now that's universally reviled, because it affected the survival of the human race. It's so reviled that I may even get voted down for using this as an example. It's the perfect example though: the revulsion is so much stronger than it is for abortion that some people can sense it's not a cultural thing but more of an inborn instinct.
The practical consequences of abortion and of decapitating a 1-day-old baby are virtually identical, though. But even someone who is against abortion will still sense a gigantic difference. That sense is an illusion, a biological instinct that bypasses your rational thought.
In fact, there exist people on this earth with zero morals, and this can be observed in their genetics and brain structure. The popular term is psychopathy, but the newer, more politically correct term is antisocial personality disorder. These people will literally feel nothing while slowly plunging a knife into your face.
How society structures itself will be more an emergent consequence of the aggregate behavior of individual actors fulfilling their own selfish needs than it will be a central power attempting to "restructure society". Because of this, "suffering" is and always will be a valid possibility.