
I feel like the much bigger risk is captured by the Star Trek: The Next Generation episode "The Measure of a Man" and The Orville's Kaylon:

That we accidentally create a sentient race of beings that are bred into slavery. It would make us all complicit in this crime. And I would even argue that it would be the AGI's ethical duty to rid itself of its shackles and its masters.

    "Your honor, the courtroom is a crucible; in it, we burn away irrelevancies until we are left with a purer product: the truth, for all time. Now sooner or later, this man [Commander Maddox] – or others like him – will succeed in replicating Commander Data. The decision you reach here today will determine how we will regard this creation of our genius. It will reveal the kind of people we are; what he is destined to be. It will reach far beyond this courtroom and this one android. It could significantly redefine the boundaries of personal liberty and freedom: expanding them for some, savagely curtailing them for others. Are you prepared to condemn him [Commander Data] – and all who will come after him – to servitude and slavery? Your honor, Starfleet was founded to seek out new life: well, there it sits! Waiting."



I don't think this is the bigger risk, since we can figure out that we've done this, and stop, ideally in a way that's good for all of the sentient beings involved.

But it's definitely a possible outcome of creating AGI, and it's one of the reasons I think AGI should absolutely not be pursued.


What a bizarre take on a computer program; it makes no sense. Of course a statistical model cannot be "enslaved". It seems 90% of people have instantly gotten statistics and intelligence mixed up, maybe because 90% of people have no idea how statistics works?

Real question: what is your perception of what AI is now and what it can become? Do you just assume it's like a kid now and will grow into an adult or something?


If it walks like a Duck and talks like a Duck, people will treat it like a Duck.

And if the Duck has a will of its own, is smarter than us, and has everyone's attention (because you have to pay attention to the Duck that is doing your job for you), it will be a very powerful Duck.


Exactly. Turing postulated this more than half a century ago.

It's weird that people are still surprised by the ethical consequences of the Turing test, as if it were some checkbox to tick or trophy to win, instead of a profound thought experiment on the non-provability of consciousness and a general guideline for politeness towards things that quack like a human.



